• Title/Summary/Keyword: Gestures

484 search results

A Unit Touch Gesture Model of Performance Time Prediction for Mobile Devices

  • Kim, Damee;Myung, Rohae
    • 대한인간공학회지 / Vol. 35, No. 4 / pp.277-291 / 2016
  • Objective: The aim of this study is to propose a unit touch gesture model that can predict performance time on mobile devices. Background: When estimating usability through Model-based Evaluation (MBE) of interfaces, the GOMS model measures 'operators' to predict execution time in the desktop environment. This study therefore applied the GOMS concept of operators to touch gestures. Since touch gestures are composed of unit touch gestures, these units can be used to predict gesture performance time on mobile devices. Method: To extract unit touch gestures, subjects' manual movements were recorded at 120 fps with pixel coordinates. Touch gestures were classified by 'out of range', 'registration', 'continuation', and 'termination' of gesture. Results: Six unit touch gestures were extracted: Hold down (H), Release (R), Slip (S), Curved-stroke (Cs), Path-stroke (Ps), and Out of range (Or). The movement time predicted by the unit touch gesture model was not significantly different from the participants' execution time, and the six unit touch gestures can also predict the movement time of undefined touch gestures such as user-defined gestures. Conclusion: Touch gestures can be subdivided into six unit touch gestures, which explain almost all current touch gestures, including user-defined ones; the model therefore has high predictive power and can be used to predict the performance time of touch gestures. Application: The unit touch gestures can simply be added up to predict the performance time of a new gesture without measuring it directly.
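The additive prediction described in the abstract resembles a Keystroke-Level-Model-style sum of operator times. A minimal sketch follows; the per-unit time constants are hypothetical placeholders, not the values measured in the study:

```python
# Sketch of an additive unit-gesture model (KLM-style).
# The time constants below are HYPOTHETICAL placeholders, not the
# values measured in the study.
UNIT_TIMES = {
    "H": 0.10,   # Hold down
    "R": 0.08,   # Release
    "S": 0.12,   # Slip
    "Cs": 0.25,  # Curved-stroke
    "Ps": 0.30,  # Path-stroke
    "Or": 0.15,  # Out of range
}

def predict_time(units):
    """Predict a gesture's performance time by summing its unit times."""
    return sum(UNIT_TIMES[u] for u in units)

# A new gesture such as a drag might decompose as hold down -> slip -> release:
print(round(predict_time(["H", "S", "R"]), 2))
```

Because the model is purely additive, a new composite gesture needs no new measurement, only a decomposition into the six units.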

Towards Establishing a Touchless Gesture Dictionary based on User Participatory Design

  • Song, Hae-Won;Kim, Huhn
    • 대한인간공학회지 / Vol. 31, No. 4 / pp.515-523 / 2012
  • Objective: The aim of this study is to investigate users' intuitive stereotypes about touchless gestures and to establish a gesture dictionary that can be applied to gesture-based interaction designs. Background: Recently, interaction based on touchless gestures has been emerging as an alternative for natural interaction between humans and systems. However, before touchless gestures can become a universal interaction method, studies of which gestures are intuitive and effective are a prerequisite. Method: In this study, four devices (i.e. TV, Audio, Computer, Car Navigation) and sixteen basic operations (i.e. power on/off, previous/next page, volume up/down, list up/down, zoom in/out, play, cancel, delete, search, mute, save) were drawn from a focus group interview and a survey as applicable domains for touchless gestures. Then, a user participatory design was performed: participants were asked to design three gestures suitable for each operation on each device, and they evaluated the intuitiveness, memorability, convenience, and satisfaction of the gestures they derived. Through the participatory design, agreement scores, frequencies, and planning times of each distinguished gesture were measured. Results: The derived gestures did not differ across the four devices; however, diverse but commonly shared gestures were derived across the kinds of operations. In particular, manipulative gestures were suitable for all kinds of operations, whereas semantic or descriptive gestures were proper to one-shot operations like power on/off, play, cancel, or search. Conclusion: The touchless gesture dictionary was established by mapping intuitive and valuable gestures onto each operation. Application: The dictionary can be applied to interaction designs based on touchless gestures. Moreover, it can serve as a basic reference for standardizing touchless gestures.
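In code, such a gesture dictionary is essentially a mapping from (device, operation) pairs to the gesture with the highest agreement. A minimal sketch with hypothetical example entries (the study's actual mappings and agreement scores are not reproduced here):

```python
# Hypothetical illustrative entries; the study's actual dictionary maps
# each of the sixteen operations to its highest-agreement gesture.
gesture_dictionary = {
    ("TV", "power on/off"): "push palm toward the screen",   # semantic, one-shot
    ("TV", "volume up"): "raise a flat hand upward",         # manipulative
    ("Audio", "next page"): "swipe the hand to the left",    # manipulative
    ("Computer", "cancel"): "wave the hand side to side",    # semantic, one-shot
}

def lookup(device, operation):
    """Return the dictionary's gesture for an operation, if one is defined."""
    return gesture_dictionary.get((device, operation), "undefined")

print(lookup("TV", "volume up"))
```

Keying on (device, operation) pairs mirrors the study's finding that gestures were largely shared across devices while varying by operation type.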

Descriptive Characteristics of Systematic Functional Gestures Used by Pre-Service Earth Science Teachers in Classroom Learning Environments

  • Yoon-Sung Choi
    • 한국지구과학회지 / Vol. 45, No. 4 / pp.377-391 / 2024
  • This study aimed to explore the characteristics and dimensions of systematic functional gestures employed by pre-service Earth science teachers during instructional sessions. Data were collected from eight students enrolled in a university's Department of Earth Science Education and included lesson plans, activity sheets, and a recording of one class session from each participant. The analysis, conducted using the systemic functional multimodal discourse analysis framework, categorized gestures into scientific and social functional dimensions; further subdivision identified meta gestures, analytical gestures, and interrelated gestures. The pre-service teachers used gestures to explain scientific concepts, represent ideas concretely, and facilitate communication during instruction. This study highlights the nonverbal strategies used by pre-service Earth science teachers, emphasizing the importance of nonverbal communication in teachers' professional development and the need to integrate it into teacher education. It also establishes a systematic conceptual framework for understanding gestures in the instructional context.

초등학생의 과학 담화에서 나타나는 몸짓의 유형과 특징 (The Types and Features of Gestures in Science Discourse of Elementary Students)

  • 나지연;송진웅
    • 한국초등과학교육학회지:초등과학교육 / Vol. 31, No. 4 / pp.450-462 / 2012
  • Gestures are a common phenomenon of human communication, yet little research has addressed gestures in science education, and most gesture research has focused on individuals. However, learning occurs through sociocultural interactions with friends, family, teachers, and others in society. Hence, the purpose of this study was to identify the types and features of the gestures elementary students make when communicating with peers in science discourse. A group of six fourth-graders was observed in eight science discourses in which they talked about ideas related to thermal concepts; data were also collected through interviews and questionnaires. The analysis showed that students' gestures in science discourse could be classified into seven types: signal iconic, illustrative iconic, personal deictic, object deictic, beat, emotional metaphoric, and content metaphoric gestures. These gestures served to repeat, supplement, or replace utterances in communication. Students frequently used gestures to express scientific terms metaphorically as everyday terms. Gestures were shared, imitated, and transferred in the communication process, and through these processes students' gestures also influenced other students' ideas.

Interacting with Touchless Gestures: Taxonomy and Requirements

  • Kim, Huhn
    • 대한인간공학회지 / Vol. 31, No. 4 / pp.475-481 / 2012
  • Objective: The aim of this study is to construct a taxonomy for classifying diverse touchless gestures and to establish the design requirements to consider when determining suitable gestures in gesture-based interaction design. Background: Recently, the applicability of touchless gestures has been increasing as the relevant technologies advance. However, before touchless gestures are widely applied to various devices and systems, an understanding of the nature of human gestures, and their standardization, is a prerequisite. Method: Diverse gesture types reported in the literature were collected, and based on them a new taxonomy for classifying touchless gestures was proposed; many gesture-based interaction design cases and studies were then analyzed. Results: The proposed taxonomy consists of two dimensions: shape (deictic, manipulative, semantic, or descriptive) and motion (static or dynamic). The case analysis based on the taxonomy showed that manipulative and dynamic gestures were the most widely applied. Conclusion: Four core requirements for valuable touchless gestures are intuitiveness, learnability, convenience, and discriminability. Application: The gesture taxonomy can be used to generate alternatives for applicable touchless gestures, and the four design requirements can serve as criteria for evaluating those alternatives.
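The two-dimensional taxonomy (shape × motion) can be sketched as a small data model; the example gesture below is hypothetical, not taken from the paper's case analysis:

```python
from dataclasses import dataclass
from enum import Enum

class Shape(Enum):
    DEICTIC = "deictic"
    MANIPULATIVE = "manipulative"
    SEMANTIC = "semantic"
    DESCRIPTIVE = "descriptive"

class Motion(Enum):
    STATIC = "static"
    DYNAMIC = "dynamic"

@dataclass(frozen=True)
class TouchlessGesture:
    name: str
    shape: Shape
    motion: Motion

# The case analysis found manipulative, dynamic gestures most widely applied;
# e.g. (hypothetical) a hand swipe used to flip pages:
page_flip = TouchlessGesture("hand swipe left", Shape.MANIPULATIVE, Motion.DYNAMIC)
print(page_flip.shape.value, page_flip.motion.value)
```

Classifying every candidate gesture along these two axes makes the 4 × 2 design space explicit when generating and comparing alternatives.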

A Comparison of the Characteristics between Single and Double Finger Gestures for Web Browsers

  • Park, Jae-Kyu;Lim, Young-Jae;Jung, Eui-S.
    • 대한인간공학회지 / Vol. 31, No. 5 / pp.629-636 / 2012
  • Objective: The purpose of this study is to compare the characteristics of single- and double-finger gestures for web browsers and to extract appropriate finger gestures. Background: As electronic equipment is miniaturized to improve portability, various interfaces are being developed as input devices. Because devices are becoming smaller, gesture recognition technology using touch-based interfaces is favored for easy editing. In addition, users focus primarily on the simplicity of intuitive interfaces, which motivates further research on gesture-based interfaces. In particular, finger gestures in these intuitive interfaces are simple, fast, and user-friendly. Recently, single- and double-finger gestures have become more popular, so more applications for these gestures are being developed. However, systems and software that employ such finger gestures lack consistency, and clear standards and guidelines are missing. Method: To learn how these gestures are applied, we used the sketch map method, a memory-elicitation technique, and the MIMA (Meaning in Mediated Action) method to evaluate the gesture interface. Results: This study created gestures appropriate for intuitive judgment. A usability test of single- and double-finger gestures showed that double-finger gestures had shorter performance times than single-finger gestures. Single-finger gestures showed a wide satisfaction gap between similar and different types: they can be judged intuitively for similar types, but it is difficult to associate them with functions for different types. Conclusion: Double-finger gestures were found to be effective for associating functions for web navigation, especially for complex forms such as curve-shaped gestures. Application: This study aims to facilitate the design of products that utilize finger and hand gestures.

계절의 변화 원인에 대한 초등학생들의 설명에서 확인된 정신 모델과 묘사적 몸짓의 관계 분석 (The Relationship between the Mental Model and the Depictive Gestures Observed in the Explanations of Elementary School Students about the Reason Why Seasons change)

  • 김나영;양일호;고민석
    • 대한지구과학교육학회지 / Vol. 7, No. 3 / pp.358-370 / 2014
  • The purpose of this study is to analyze the relationship between the mental models and the depictive gestures observed in elementary school students' explanations of why the seasons change. Analyzing the gestures associated with each mental model showed that the CM-type mental model was remembered as "motion" and produced more "exophoric" gestures that express meaning like language. The CF type was remembered through "writings or pictures," and metaphoric gestures were used when explaining some alternative concepts. The CF-UM type explained in detail with language and showed many "lexical" gestures. Analyzing the depictive gestures, even for sub-categories such as rotation, revolution, and meridian altitude, a great many gesture types were expressed, such as indicating with fingers, palms, arms, ball-point pens, or fists, or drawing, spinning, and pointing; through these we could examine the students' conceptual understanding. In addition, by analyzing inconsistencies among external representations such as verbal language and gesture, writing and gesture, and picture and gesture, we found that gestures can help reveal students' mental models, and that information not conveyed in verbal explanations or drawings was sometimes expressed in gestures. Additionally, we examined two participants who showed conspicuous differences: one seemed to be wrong because he used his own expressions, yet his gestures were precise, while the other seemed accurate, yet analysis of his gestures revealed idiosyncratic conceptions.

어휘인출과 구어동반 제스처의 관계 (The Relationship between Lexical Retrieval and Coverbal Gestures)

  • 하지완;심현섭
    • 인지과학 / Vol. 22, No. 2 / pp.123-143 / 2011
  • The purpose of this study is to examine which stage of lexical retrieval, conceptualization or lexicalization, coverbal gestures are related to. For the gesture and speech analysis, we used video clips from a TV variety program in which a performer explains the meaning of a given target word so that a partner on the telephone can guess the word. This TV material was chosen because the game task was judged to induce both the conceptualization and lexicalization processes of lexical retrieval simultaneously. Twenty video clips were replayed; the target words and the utterances the performers produced while explaining them were transcribed, and the gestures used were recorded, classified as either lexical gestures or motor gestures. To examine whether coverbal gestures are related to conceptualization, we analyzed whether gesture use differed when explaining concrete versus abstract words, and whether the difficulty of a word's concept correlated with the amount of gesturing. To examine whether gestures are related to lexicalization, we examined the correlations between the number of words produced while explaining a target word and the amount of gesturing, and between the proportion of low-frequency words and the amount of gesturing. The results showed that the type of gesture that predominated differed significantly with the imageability of the word concept: lexical gestures accompanied explanations of concrete words significantly more often than abstract words, while motor gestures accompanied explanations of abstract words significantly more often than concrete words. For concrete words, concept difficulty correlated significantly with the amount of gesturing. However, no correlation was found between the number of words produced and the amount of gesturing, or between the proportion of low-frequency words and the amount of gesturing. These results suggest that coverbal gestures reflect the conceptualization stage of lexical retrieval. This study is also meaningful in that it attempts a new approach to the function of motor gestures, which many previous studies have overlooked.


과학 탐구에서 몸짓의 역할과 중요성 (The Role and Importance of Gesture in Science Exploration)

  • 한재영;최정훈;신영준;손정우;차정호;홍준의
    • 한국초등과학교육학회지:초등과학교육 / Vol. 25, No. 1 / pp.51-58 / 2006
  • The language and gestures of a teacher generally have a great influence on the effectiveness of a lesson, because subject content is conveyed to students through them. In science lessons that focus on experiments, the language and gestures of both students and teachers support the learning of scientific content. However, despite its importance, the role of gestures has rarely been investigated in science education research, and the gestures of students and teachers remain a much-needed area of study. This study investigated the gestures observed in the experimental process performed by students who participated in a science exploration activity. Students' gestures played an essential role in the successful performance of the experiment, and they could function as a means of resolving contradictory situations. In addition, the demonstration and communication of gestures should be performed very cautiously. The findings carry a number of implications for the long-standing problem of the relation between understanding science concepts and performing experiments.


터치스크린 기반 웹브라우저 조작을 위한 손가락 제스처 개발 (Development of Finger Gestures for Touchscreen-based Web Browser Operation)

  • 남종용;최재호;정의승
    • 대한인간공학회지 / Vol. 27, No. 4 / pp.109-117 / 2008
  • Compared to existing PCs, which use a mouse and keyboard, touchscreen-based portable PCs let users operate with their fingers, requiring new operation methods. However, current touchscreen-based web browser operations often merely have fingers move and click like a mouse, or do not correspond well to the user's sensibility and the structure of the index finger, making them difficult to use while walking. Therefore, the goal of this study is to develop finger gestures that facilitate interaction between the interface and the user and make operation easier. First, the top eight functions were extracted based on frequency of use in the web browser and preference. Then, the users' structural knowledge was visualized through sketch maps, and finger gestures applicable on touchscreens were derived through the Meaning in Mediated Action method. For the forward/back page and up/down scroll functions, directional gestures were derived; for the window-close, refresh, home, and print functions, letter-type and icon-type gestures were drawn. A validation experiment compared the existing operation methods with the proposed ones in terms of execution time, error rate, and preference; directional gestures and letter-type gestures outperformed the existing methods. These results suggest that the new gestures can make operation easier and faster, not only for touchscreen-based web browsers on portable PCs but also for telematics-related functions in automobiles, PDAs, and so on.