• Title/Summary/Keyword: user gestures


Direction of Touch Gestures and Perception of Inner Scroll in Smartphone UI (스마트폰 UI에서 터치 제스처의 방향성과 이너 스크롤의 인지)

  • Lee, Young-Ju
    • Journal of Digital Convergence / v.19 no.2 / pp.409-414 / 2021
  • In this paper, we investigate the directionality of scroll touch gestures in UIs that are narrow and long because of the characteristics of smartphone devices, which are now in widespread use. Touch gestures are triggered and directed by cues such as metaphors and affordances grounded in past experience. Different touch gestures are used depending on whether the gesture performs navigation, motion, or transformation, and scrolling is the most frequently used among them. Scrolling is generally vertical, but design patterns that also allow horizontal scrolling inside the page have recently appeared, causing cognitive dissonance for users. When an inner scroll area signals horizontal scrollability by partially covering the content on the right, mixing it with non-scrollable design patterns forces users to pay extra attention. We found that using such triggers together with consistent design patterns can enhance the user experience even in an inner-scroll environment.

A Notation Method for Three Dimensional Hand Gesture

  • Choi, Eun-Jung;Kim, Hee-Jin;Chung, Min-K.
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.541-550 / 2012
  • Objective: The aim of this study is to suggest a notation method for three-dimensional hand gestures. Background: To match intuitive gestures with the commands of products, various studies have tried to derive gestures from users. In such cases, a variety of gestures is derived for a single command because users have different experiences. Thus, organizing the gestures systematically and identifying similar patterns among them has become an important issue. Method: Related studies on gesture taxonomy and sign language notation were investigated. Results: Through the literature review, a total of five elements of static gesture were selected, and a total of three forms of dynamic gesture were identified. Temporal variability (repetition) was additionally selected. Conclusion: A notation method that follows a combination sequence of the gesture elements was suggested. Application: The notation method for three-dimensional hand gestures may be used to describe and organize user-defined gestures systematically.
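
The notation scheme described above combines static elements, a dynamic form, and temporal variability (repetition) into one ordered sequence. As a rough illustration only, the sketch below encodes such a combination sequence in Python; the element names, their order, and the separator are assumptions, not the authors' actual taxonomy.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical element sets; the paper's actual five static elements and
# three dynamic forms are not listed in this abstract.
STATIC_ELEMENTS = ("hand_shape", "palm_orientation", "location", "fingers_used", "contact")
DYNAMIC_FORMS = ("path_movement", "wrist_rotation", "finger_movement")

@dataclass
class HandGestureNotation:
    static: dict               # one value per static element, e.g. {"hand_shape": "fist", ...}
    dynamic: Optional[str]     # one of DYNAMIC_FORMS, or None for a purely static gesture
    repetitions: int = 1       # temporal variability (repetition)

    def to_string(self) -> str:
        # Combination sequence: static elements in a fixed order,
        # then the dynamic form, then the repetition count.
        parts = [self.static.get(e, "-") for e in STATIC_ELEMENTS]
        parts.append(self.dynamic or "-")
        parts.append(f"x{self.repetitions}")
        return "/".join(parts)

# Example: a fist moved along a path twice.
g = HandGestureNotation(
    static={"hand_shape": "fist", "palm_orientation": "down", "location": "chest"},
    dynamic="path_movement",
    repetitions=2,
)
print(g.to_string())
```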

User Gestures as a Voluntary Action in Products Design - Focused on a Gesture Discovered in User Positive Action to Transform Products (제품디자인에 있어서 자발적 행위로의 유저제스처 -사용자의 긍정적 제품변형행위에 관한 제스처를 중심으로-)

  • 진선태;우흥룡
    • Archives of design research / v.17 no.2 / pp.95-104 / 2004
  • Creativity is an important keyword not only for the design organizations that need it but also for users. However, little attention has been paid to users' creativity, and until now there have been few attempts to apply it to design development. In design today, users' experiences and actions are shifting from a passive state of receiving meaning to an active state of creating meaning voluntarily. It is reasonable to suppose that this creative stage matters to users and that they are capable of inventing new uses and creating new products. Users' experience of objects includes both what has been formed or supported in advance and voluntary interpretations acquired on their own; the latter may be possibilities anticipated in the design process or areas of user action that remain unknown. Creative use appears to consist of actions that apply to and deviate from the usability and functions defined by the design organization, together with creative products arranged and made by the users themselves. Because such behavior occurs frequently, it deserves systematic examination and research. In this research we approach it with the term 'user gestures': characteristic areas of action grounded in users' voluntary behavior, in which non-essential and non-operational functions are revealed as actions in themselves, and various transformations and creations of products appear as outcomes of those actions. This suggests that user gestures offer a lively spectrum for observing user culture and can be an attractive way for designers and developers to discover new design concepts. Further directions for this study include ethnographic research on user gestures, cultural research on the phenomenon of user design, and UGSBD (user gesture scenario based design) research. The results may be applied to design development such as user-initiated customizable products, user-participatory recycled products, and creativity-experience design.

Design of Hand Gestures for Smart Home Appliances based on a User Centered Approach (스마트홈 내 기기와의 상호작용을 위한 사용자 중심의 핸드 제스처 도출)

  • Choi, Eun-Jung;Kwon, Sung-Hyuk;Lee, Dong-Hun;Lee, Ho-Jin;Chung, Min-K.
    • Journal of Korean Institute of Industrial Engineers / v.38 no.3 / pp.182-190 / 2012
  • With the progress of both wired and wireless home networking technology, various smart home projects have been carried out around the world (Harper, 2003), and at the same time, new approaches to interacting with smart home systems efficiently and effectively have been investigated. A gesture-based interface is one of these approaches. Especially with the advance of gesture recognition technologies, a variety of studies on gesture interaction with the functions of IT devices have been conducted. However, few studies have suggested and investigated gestures for controlling smart home appliances. In this research, gestures for selected smart home appliances are suggested based on a user-centered approach. A total of thirty-eight functions were selected, and a total of thirty participants generated gestures for each function. Following Nielsen (2004), Lee et al. (2010), and Kuhnel et al. (2011), the gesture with the highest frequency for each function (the top gesture) was derived and investigated.
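
The "top gesture" selection described above is essentially a frequency count over participant-elicited gestures: for each function, keep the gesture proposed most often. The minimal sketch below shows that tally; the record layout, function names, and gesture labels are made up for illustration and are not the study's own data.

```python
from collections import Counter, defaultdict

# Hypothetical elicitation records: (participant_id, function, proposed_gesture).
records = [
    (1, "TV: volume up", "raise flat hand"),
    (2, "TV: volume up", "raise flat hand"),
    (3, "TV: volume up", "thumb up"),
    (1, "Light: off", "close fist"),
    (2, "Light: off", "swipe down"),
    (3, "Light: off", "close fist"),
]

def top_gestures(records):
    """Return, for each function, the most frequently proposed gesture and its count."""
    by_function = defaultdict(Counter)
    for _pid, function, gesture in records:
        by_function[function][gesture] += 1
    return {f: counts.most_common(1)[0] for f, counts in by_function.items()}

for function, (gesture, freq) in top_gestures(records).items():
    print(f"{function}: top gesture = '{gesture}' ({freq} participants)")
```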

Gesture Interaction Design based on User Preference for the Elastic Handheld Device

  • Yoo, Hoon Sik;Ju, Da Young
    • Journal of the Ergonomics Society of Korea / v.35 no.6 / pp.519-533 / 2016
  • Objective: This study aims to define appropriate operation methods and functions by investigating the value that a handheld smart device made of a soft, flexible, jelly-like material can bring. Background: New technologies and materials transform the form of interfaces and change how they are operated. Recently, the study of Organic User Interfaces (OUI), which investigates the value of new input and output methods adopting soft and flexible materials in various devices, has grown in importance. Method: Based on existing studies, 27 gestures usable on a handheld device were defined. A quantitative survey of adult men and women in their 20s and 30s was conducted, and the functions that could be linked to the gestures with the highest satisfaction were analyzed. To analyze users' needs and hurdles for the defined gestures, focus group interviews were conducted with groups of early adopters and ordinary users. Results: Users were found to place great value on the usability and fun of an elastic device, and the preferred gestures and the functions they can be linked to were analyzed. Conclusion: What is most significant about this study is that it sheds new light on the value of a device made of elastic material. Beyond finding and defining the gestures and functions that can be applied to a handheld elastic device, the study identified the value elements that users fundamentally desire from an elastic device: 'usability' and 'fun'. Application: The preference and satisfaction data on the gestures and associated functions produced by this study will help commercialize elastic devices in the future.

A Joystick-driven Mouse Controlling Method using Hand Gestures (손 제스쳐를 이용한 조이스틱 방식의 마우스제어 방법)

  • Jung, Jin-Young;Kim, Jung-In
    • Journal of Korea Multimedia Society / v.19 no.1 / pp.60-67 / 2016
  • PC users have long controlled their computers using input devices such as the mouse and keyboard. To address the inconveniences of these devices, screen touch has become widely used, and devices that recognize human gestures are being developed one after another. For example, Kinect, developed and distributed by Microsoft, is a non-contact input device that recognizes human gestures through motion-recognition sensors, thereby replacing the mouse as an input device. However, when controlling the mouse on a large screen, it suffers from the problem that large motions are required to move the mouse pointer to the edges of the screen. In this paper, we propose a joystick-driven mouse-controlling method that enables the user to move the mouse pointer to the corners of the screen with small motions. The experimental results show that movements of the user's palm within a range of 30cm are enough to move the mouse pointer to the edges of the screen.
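
The joystick-style mapping the abstract describes drives pointer velocity from the palm's offset from a neutral position, so a small, fixed hand range can still reach every screen edge. The following sketch shows that idea; the function names, gains, dead zone, and the way the 30cm range is split are illustrative assumptions, not the paper's implementation.

```python
# Joystick-style pointer control: the palm's offset from a neutral point sets
# the pointer's velocity, so a ~30 cm hand range can reach any screen edge.

SCREEN_W, SCREEN_H = 1920, 1080
NEUTRAL = (0.0, 0.0)        # palm position (cm) treated as "stick centered"
DEAD_ZONE_CM = 3.0          # ignore tiny offsets to avoid drift
MAX_OFFSET_CM = 15.0        # +/-15 cm -> full-speed movement (30 cm total range)
MAX_SPEED_PX_S = 1500.0     # pointer speed at full deflection

def pointer_step(pointer, palm_cm, dt):
    """Advance the pointer (x, y) by one frame, given the palm position in cm."""
    px, py = pointer
    for axis, (p, n) in enumerate(zip(palm_cm, NEUTRAL)):
        offset = p - n
        if abs(offset) < DEAD_ZONE_CM:
            continue
        # Clamp deflection to the working range and scale it to a velocity.
        deflection = max(-1.0, min(1.0, offset / MAX_OFFSET_CM))
        if axis == 0:
            px += deflection * MAX_SPEED_PX_S * dt
        else:
            py += deflection * MAX_SPEED_PX_S * dt
    # Keep the pointer on screen.
    return (max(0, min(SCREEN_W - 1, px)), max(0, min(SCREEN_H - 1, py)))

# Example: palm held 10 cm to the right of neutral for one 1/60 s frame.
print(pointer_step((960, 540), (10.0, 0.0), 1 / 60))
```

Because velocity rather than position is mapped, holding the palm near an edge of its small working range keeps the pointer gliding toward the corresponding screen edge, which is what lets small motions cover a large display.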

Dynamic Training Algorithm for Hand Gesture Recognition System (손동작 인식 시스템을 위한 동적 학습 알고리즘)

  • Kim, Moon-Hwan;Hwang, Suen-Ki;Bae, Cheol-Soo
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.2 no.2 / pp.51-56 / 2009
  • We developed a new augmented reality tool for vision-based hand gesture recognition in a camera-projector system. Our recognition method uses modified Fourier descriptors for the classification of static hand gestures. Hand segmentation is based on a background subtraction method, which is improved to handle background changes. Most recognition methods are trained and tested by the same service-person, and the training phase takes place only before the interaction. However, there are numerous situations in which several untrained users would like to use gestures for interaction. In our new practical approach, incorrectly detected gestures are corrected during recognition itself. Our main result is quick on-line adaptation to a new user's gestures, achieving user-independent gesture recognition.
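
Classification with Fourier descriptors, as mentioned above, starts from the hand contour: the boundary points are treated as complex numbers, transformed with the FFT, and normalized so the resulting signature is insensitive to translation, scale, rotation, and starting point. The sketch below shows plain (unmodified) Fourier descriptors with nearest-template matching; it assumes a contour is already available and is only an illustration of the general technique, not the paper's modified variant.

```python
import numpy as np

def fourier_descriptor(contour_xy, n_coeffs=16):
    """Compute an invariant Fourier descriptor from a closed hand contour
    given as an (N, 2) array of (x, y) boundary points."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # boundary as a complex signal
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                                # drop DC term -> translation invariance
    mags = np.abs(coeffs)                          # magnitudes -> rotation / start-point invariance
    mags = mags / (mags[1] + 1e-12)                # divide by first harmonic -> scale invariance
    # Keep the lowest positive and negative harmonics as the shape signature.
    return np.concatenate([mags[1:1 + n_coeffs // 2], mags[-(n_coeffs // 2):]])

def classify(descriptor, templates):
    """Nearest-template classification: templates maps gesture name -> descriptor."""
    return min(templates, key=lambda name: np.linalg.norm(descriptor - templates[name]))
```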

Dynamic Training Algorithm for Hand Gesture Recognition System (손동작 인식 시스템을 위한 동적 학습 알고리즘)

  • Bae, Cheol-Soo
    • Journal of the Korea Institute of Information and Communication Engineering / v.11 no.7 / pp.1348-1353 / 2007
  • We developed a new augmented reality tool for vision-based hand gesture recognition in a camera-projector system. Our recognition method uses modified Fourier descriptors for the classification of static hand gestures. Hand segmentation is based on a background subtraction method, which is improved to handle background changes. Most recognition methods are trained and tested by the same service-person, and the training phase takes place only before the interaction. However, there are numerous situations in which several untrained users would like to use gestures for interaction. In our new practical approach, incorrectly detected gestures are corrected during recognition itself. Our main result is quick on-line adaptation to a new user's gestures, achieving user-independent gesture recognition.
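
The other ingredient this abstract names is hand segmentation by background subtraction: each frame is compared against a model of the empty scene, and pixels that differ are kept as the hand mask. A bare-bones OpenCV sketch is shown below; the running-average update, thresholds, and morphology are illustrative choices, not the paper's exact scheme for handling background changes.

```python
import cv2
import numpy as np

def segment_hand(frame, background, thresh=30, alpha=0.02):
    """Return a binary hand mask and an updated background model.

    background is a float32 grayscale image; it is updated with a running
    average so that slow background changes are absorbed over time.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    diff = cv2.absdiff(gray, background)
    _, mask = cv2.threshold(diff.astype(np.uint8), thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # Update the background only where no hand was detected.
    background = np.where(mask == 0, (1 - alpha) * background + alpha * gray, background)
    return mask, background
```

In practice the background model would be initialized from a frame of the empty scene and then updated on every frame, which is the kind of adaptation the abstract's note about handling background changes points to.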

HAND GESTURE INTERFACE FOR WEARABLE PC

  • Nishihara, Isao;Nakano, Shizuo
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.664-667 / 2009
  • There is strong demand for wearable PC systems that can support the user outdoors. When we are outdoors, our movement makes it impossible to use traditional input devices such as keyboards and mice. We propose a hand gesture interface based on image processing to operate wearable PCs. A semi-transparent PC screen is displayed on the head mounted display (HMD), and the user makes hand gestures to select icons on the screen. The user's hand is extracted from the images captured by a color camera mounted above the HMD. Since skin color can vary widely due to outdoor lighting, a key problem is accurately discriminating the hand from the background. The proposed method does not assume any fixed skin color space. First, the image is divided into blocks, and blocks with similar average color are linked. Contiguous regions are then subjected to hand recognition. Blocks on the edges of the hand region are subdivided for more accurate finger discrimination. A change in hand shape is recognized as hand movement. Our current input interface associates a hand grasp with a mouse click. Tests on a prototype system confirm that the proposed method recognizes hand gestures accurately and at high speed. We intend to develop a wider range of recognizable gestures.
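
The block-linking step described above averages the color inside each block and merges neighboring blocks whose averages are close, without assuming any fixed skin-color range. The simplified sketch below shows only that step; the block size, the color-distance threshold, and the flood-fill merging are assumptions for illustration, and the later stages (edge-block subdivision, hand recognition) are omitted.

```python
import numpy as np

def link_similar_blocks(image, block=16, max_dist=20.0):
    """Split an H x W x 3 image into blocks, average each block's color, and
    give 4-connected blocks with similar averages the same region label."""
    h, w, _ = image.shape
    bh, bw = h // block, w // block
    means = image[:bh * block, :bw * block].reshape(bh, block, bw, block, 3).mean(axis=(1, 3))

    labels = -np.ones((bh, bw), dtype=int)
    next_label = 0
    for i in range(bh):
        for j in range(bw):
            if labels[i, j] != -1:
                continue
            # Flood-fill over neighbors whose mean color is close to this block's.
            labels[i, j] = next_label
            stack = [(i, j)]
            while stack:
                y, x = stack.pop()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if not (0 <= ny < bh and 0 <= nx < bw):
                        continue
                    if labels[ny, nx] == -1 and np.linalg.norm(means[ny, nx] - means[y, x]) < max_dist:
                        labels[ny, nx] = next_label
                        stack.append((ny, nx))
            next_label += 1
    return labels  # per-block region labels; candidate hand regions are picked from these
```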

Hand Gesture Segmentation Method using a Wrist-Worn Wearable Device

  • Lee, Dong-Woo;Son, Yong-Ki;Kim, Bae-Sun;Kim, Minkyu;Jeong, Hyun-Tae;Cho, Il-Yeon
    • Journal of the Ergonomics Society of Korea / v.34 no.5 / pp.541-548 / 2015
  • Objective: We introduce a hand gesture segmentation method using a wrist-worn wearable device that recognizes the simple gestures of clenching and unclenching one's fist. Background: There are many types of smart watches and fitness bands on the market, and most of them already adopt gesture interaction for ease of use. However, in many cases it is difficult to distinguish the user's gesture commands from ordinary daily motions, which leads to malfunctions. A simple and clear gesture segmentation method is needed to improve gesture interaction performance. Method: First, we defined making a fist (start of a gesture command) and opening one's fist (end of a gesture command) as the segmentation gestures that delimit a gesture. Clenching and unclenching one's fist are simple and intuitive. We also designed a single gesture as the ordered set of making a fist, a command gesture, and opening one's fist. To detect the segmentation gestures at the bottom of the wrist, we used a wrist strap on which an array of infrared sensors (emitters and receivers) was mounted. When a user makes or opens a fist, the shape of the bottom of the wrist changes, which simultaneously changes the amount of infrared light reflected back to the receiver sensors. Results: An experiment was conducted to evaluate gesture segmentation performance. Twelve participants took part: 10 males and 2 females with an average age of 38. The recognition rates of the segmentation gestures, clenching and unclenching one's fist, were 99.58% and 100%, respectively. Conclusion: Through the experiment, we evaluated the segmentation performance and usability of the method, and the results show its potential. Application: The results of this study can be used to develop guidelines to prevent injury in auto workers at mission assembly plants.
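
The segmentation logic described above watches the reflected-IR level from the sensors under the wrist: a clench marks the start of a command window and the following unclench marks its end, and whatever happens in between is treated as the command gesture. The sketch below shows that windowing; the threshold values, the hysteresis, and the assumption that the readings are averaged and normalized to 0..1 are hypothetical, not the device's actual firmware.

```python
def segment_commands(ir_samples, clench_level=0.7, open_level=0.4):
    """Yield (start_index, end_index) windows for gesture commands.

    ir_samples is a sequence of reflected-IR readings (averaged over the sensor
    array and normalized to 0..1). A reading above clench_level is treated as
    "fist clenched" (command start); a later reading below open_level is treated
    as "fist opened" (command end). Using two thresholds (hysteresis) avoids
    flicker around a single boundary value.
    """
    start = None
    for i, level in enumerate(ir_samples):
        if start is None and level >= clench_level:
            start = i                      # making a fist: command begins
        elif start is not None and level <= open_level:
            yield (start, i)               # opening the fist: command ends
            start = None

# Example with a made-up normalized signal: one command window from sample 2 to 6.
signal = [0.2, 0.3, 0.8, 0.9, 0.85, 0.8, 0.3, 0.2]
print(list(segment_commands(signal)))
```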