• Title/Summary/Keyword: Multi-touch recognition

Search Results: 33

Development of IrobiQ on School Violence Prevention Program for Young Children (지능형 로봇 아이로비큐(IrobiQ)를 활용한 학교폭력 예방 프로그램 개발)

  • Hyun, Eunja; Lee, Hawon; Yeon, Hyemin
    • The Journal of the Korea Contents Association / v.13 no.9 / pp.455-466 / 2013
  • The purpose of this study was to develop "Modujikimi", a school violence prevention program for young children embedded in IrobiQ, a teacher-assistive robot. The program's themes consisted of basic character education, bullying prevention education, and sexual violence prevention education. The activity types included large-group, small-group, and individual activities, free-choice activities, and parents' education, incorporating poems, fairy tales, music, art, and story sharing. The multi-modal functions of the robot were employed: on-screen images, TTS (Text To Speech), touch input, sound recognition, and a recording system. The robot content was demonstrated to thirty early childhood educators, whose acceptance of the content was measured using questionnaires, and the content was also applied to children in a daycare center. The majority of respondents rated the content positively. The results of this study suggest that further research is needed to improve the two-way interactivity of teacher-assistive robots.
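
As a rough illustration of how the multi-modal functions listed above (screen images, TTS, touch input, sound recognition, recording) might be scripted into a single activity, here is a minimal sketch in Python. The `RobotAPI` class and every method on it are hypothetical stand-ins, not the actual IrobiQ SDK; the scene name and touch targets are likewise invented for illustration.

```python
# Hypothetical sketch of a multi-modal robot activity script.
# RobotAPI and its methods are illustrative assumptions, not the real IrobiQ SDK.

class RobotAPI:
    """Stand-in for a teacher-assistive robot's content-player interface."""

    def show_image(self, path: str) -> None:
        print(f"[screen] showing {path}")

    def speak(self, text: str) -> None:  # TTS function
        print(f"[tts] {text}")

    def wait_for_touch(self, targets: list[str]) -> str:
        # Block until the child touches one of the on-screen targets.
        return targets[0]  # stubbed: pretend the first target was touched

    def record_audio(self, seconds: int) -> bytes:
        # Record the child's spoken response for later review.
        return b""


def bullying_prevention_activity(robot: RobotAPI) -> None:
    """One small-group scene: show a story image, ask a question, log the answer."""
    robot.show_image("scenes/playground_conflict.png")
    robot.speak("What should you do when a friend is being teased?")
    choice = robot.wait_for_touch(["tell_teacher", "ignore", "join_in"])
    robot.speak(f"You chose: {choice}. Let's talk about why.")
    robot.record_audio(seconds=10)  # capture the child's explanation


if __name__ == "__main__":
    bullying_prevention_activity(RobotAPI())
```

Structuring each scene as show-ask-touch-record would let the two-way interactivity the authors call for be extended one event handler at a time.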

A Study on the Windows Application Control Model Based on Leap Motion (립모션 기반의 윈도우즈 애플리케이션 제어 모델에 관한 연구)

  • Kim, Won
    • Journal of the Korea Convergence Society / v.10 no.11 / pp.111-116 / 2019
  • With the recent rapid development of computing capabilities, various technologies that facilitate interaction between humans and computers are being studied. The paradigm is shifting from GUIs driven by traditional input devices to NUIs that use the body, such as 3D motion, haptics, and multi-touch. Various studies have been conducted on transferring human movements to computers using sensors, and with the development of optical sensors that can acquire 3D objects, the range of applications in the industrial, medical, and user-interface fields has expanded. In this paper, I present a model that can launch programs through gestures instead of the mouse, the default input device, and control Windows based on Leap Motion. The proposed model also converges with an Android application, allowing various media to be controlled through voice instructions, using voice recognition and buttons connected to a main client. It is expected that Internet media such as video and music can be controlled not only on the client computer but also remotely from the application, enabling convenient media viewing through the proposed model.
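
The gesture-to-program dispatch the abstract describes can be pictured as a small lookup table from recognized gestures to Windows shell commands. The sketch below stubs out the sensor loop; in the paper's setting the gestures would come from the Leap Motion SDK's frame stream, and the gesture names and program mappings here are illustrative assumptions, not the author's actual model.

```python
# Sketch of a gesture-to-action dispatcher for controlling Windows programs.
# The gesture source is stubbed; a real version would poll the Leap Motion
# controller. Gesture names and command mappings are illustrative assumptions.

import subprocess

# Map recognized gestures to Windows commands (illustrative choices).
GESTURE_ACTIONS = {
    "swipe_right": ["explorer.exe"],                    # open File Explorer
    "circle":      ["notepad.exe"],                     # launch an editor
    "key_tap":     ["cmd", "/c", "start", "wmplayer"],  # start a media player
}


def read_gesture() -> str:
    """Stub for the sensor loop; a real version would poll the Leap controller."""
    return "swipe_right"


def dispatch(gesture: str) -> None:
    cmd = GESTURE_ACTIONS.get(gesture)
    if cmd is None:
        return  # unrecognized gesture: ignore rather than guess
    subprocess.Popen(cmd)  # launch the mapped program without blocking


if __name__ == "__main__":
    dispatch(read_gesture())
```

Keeping the mapping in a plain dictionary means the voice-instruction path from the Android client could reuse `dispatch` unchanged, feeding it recognized commands instead of gestures.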

A Conversational Interactive Tactile Map for the Visually Impaired (시각장애인의 길 탐색을 위한 대화형 인터랙티브 촉각 지도 개발)

  • Lee, Yerin; Lee, Dongmyeong; Quero, Luis Cavazos; Bartolome, Jorge Iranzo; Cho, Jundong; Lee, Sangwon
    • Science of Emotion and Sensibility / v.23 no.1 / pp.29-40 / 2020
  • Visually impaired people use tactile maps to obtain spatial information about their surroundings, find their way, and improve their independent mobility. However, classical tactile maps that use braille to describe locations within the map have several limitations, such as a lack of information due to space constraints and limited feedback possibilities. This study describes the development of a new multi-modal interactive tactile map interface that addresses these challenges to improve the usability and independence of visually impaired people. The interface adds touch gesture recognition to the surface of tactile maps and enables users to interact verbally with a voice agent to receive feedback and information about navigation routes and points of interest. A low-cost prototype was developed to conduct usability tests, which evaluated the interface through a survey and interviews given to blind participants after they used the prototype. The test results show that the interactive tactile map prototype provides improved usability over traditional braille-only tactile maps. Participants reported that it was easier to find the starting point and the points of interest they wished to navigate to, and that the prototype improved self-reported independence and confidence compared with traditional tactile maps. Future work includes further development of the mobility solution based on the feedback received and an extensive quantitative study.
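
The core interaction the abstract describes, touching a map location and hearing information about it, reduces to a nearest-point lookup followed by spoken feedback. The sketch below is a minimal illustration under assumed coordinates and point-of-interest names; the `speak` stub stands in for the prototype's conversational voice agent.

```python
# Minimal sketch of the touch-to-speech lookup behind an interactive tactile map.
# Coordinates, POI names, and the speak() stub are illustrative assumptions;
# the paper's prototype pairs touch recognition with a conversational voice agent.

import math

# Points of interest registered on the tactile map surface (x, y in millimeters).
POINTS_OF_INTEREST = {
    (40, 120): "Main entrance",
    (95, 60): "Elevator",
    (150, 30): "Restroom",
}
TOUCH_RADIUS_MM = 15  # how close a touch must land to count as a hit


def speak(text: str) -> None:
    """Stub; a real build would call a TTS engine (e.g., pyttsx3) here."""
    print(f"[voice] {text}")


def on_touch(x: float, y: float) -> None:
    """Announce the nearest POI if the touch lands close enough to one."""
    nearest = min(
        POINTS_OF_INTEREST,
        key=lambda p: math.hypot(p[0] - x, p[1] - y),
    )
    if math.hypot(nearest[0] - x, nearest[1] - y) <= TOUCH_RADIUS_MM:
        speak(f"You are touching: {POINTS_OF_INTEREST[nearest]}")
    else:
        speak("No point of interest here. Try moving your finger slowly.")


if __name__ == "__main__":
    on_touch(42, 118)  # simulated touch near the main entrance
```

A tolerance radius rather than exact hit-testing is one plausible way to accommodate imprecise finger placement on a raised-relief surface.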