• Title/Summary/Keyword: Wearable Interface

Technology Requirements for Wearable User Interface

  • Cho, Il-Yeon
    • Journal of the Ergonomics Society of Korea / v.34 no.5 / pp.531-540 / 2015
  • Objective: The objective of this research is to investigate the fundamentals of human-computer interaction for wearable computers and to derive technology requirements. Background: A wearable computer can be worn at any time, with support for unrestricted communications and a variety of services that maximize the usefulness of information. Key challenges in developing such wearable computers are a level of comfort such that users do not feel what they are wearing, and an easy, intuitive user interface. The research presented in this paper examines user interfaces for wearable computers. Method: In this research, we classified wearable user interface technologies and analyzed their advantages and disadvantages from the user's point of view. Based on this analysis, we identified the user interface technologies on which research and development should focus for commercialization. Results: Technology requirements are derived for making wearable computers commercially viable. Conclusion: User interface technology for wearable systems must start from an understanding of the ergonomic characteristics of the end user, because users wear the system on their body. Developers should not pursue state-of-the-art technology without a requirement analysis of the end users: if people do not use a technology, it cannot survive in the market. Currently, there is no dominant wearable user interface, so this area invites new challenges beyond the traditional interface paradigm through various approaches and attempts. Application: The findings of this study are expected to be used in designing user interfaces for wearable systems such as digital clothes and fashion apparel.

Teleoperation of Field Mobile Manipulator with Wearable Haptic-based Multi-Modal User Interface and Its Application to Explosive Ordnance Disposal

  • Ryu Dongseok; Hwang Chang-Soon; Kang Sungchul; Kim Munsang; Song Jae-Bok
    • Journal of Mechanical Science and Technology / v.19 no.10 / pp.1864-1874 / 2005
  • This paper describes the design and implementation of a wearable multi-modal user interface for a teleoperated field robot system. Recently, teleoperated field robots have been employed in hazardous-environment applications (e.g., rescue, explosive ordnance disposal, security). To complete such missions outdoors, the robot system must have appropriate functions, accuracy, and reliability. However, the more functions it has, the more difficult it becomes to operate them, so an effective user interface must be developed. Furthermore, the user interface needs to be wearable for portability and prompt action. This research starts from the question of how to teleoperate a complicated slave robot easily. The main challenge is to make a simple and intuitive user interface in a wearable shape and size. The proposed interface provides multiple modalities (visual, auditory, and haptic senses), enabling an operator to control every function of a field robot more intuitively. An EOD (explosive ordnance disposal) demonstration was conducted to verify the validity of the proposed wearable multi-modal user interface.
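
As a rough illustration of the multi-modal idea in this abstract, the Python sketch below routes slave-robot events to visual, auditory, and haptic feedback channels. The event names, force scaling, and class structure are invented here for illustration and are not the paper's implementation.

    from dataclasses import dataclass

    @dataclass
    class RobotEvent:
        kind: str      # e.g. "contact_force" or "low_battery" (hypothetical names)
        value: float

    class MultiModalUI:
        """Routes slave-robot events to visual, auditory, and haptic outputs."""

        def on_event(self, ev: RobotEvent) -> None:
            if ev.kind == "contact_force":
                # Haptic channel: map gripper contact force to a vibration level.
                self.haptic(min(ev.value / 50.0, 1.0))
            elif ev.kind == "low_battery":
                self.audio("battery low")                   # auditory alert
            self.display(f"{ev.kind}: {ev.value:.1f}")      # mirror on the HMD

        def haptic(self, level: float) -> None:
            print(f"[haptic] vibrate at {level:.2f}")

        def audio(self, msg: str) -> None:
            print(f"[audio] {msg}")

        def display(self, text: str) -> None:
            print(f"[display] {text}")

    MultiModalUI().on_event(RobotEvent("contact_force", 32.0))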

A Wearable Interface for Tendon-driven Robotic Hand Prosthesis (건구동식 로봇 의수용 착용형 인터페이스)

  • Jung, Sung-Yoon; Park, Chan-Young; Bae, Ju-Hawn; Moon, In-Hyuk
    • Journal of Institute of Control, Robotics and Systems / v.16 no.4 / pp.374-380 / 2010
  • This paper proposes a wearable interface for a tendon-driven robotic hand prosthesis. The proposed interface consists of a dataglove that measures finger and wrist joint angles, and a micro-controller board with a wireless RF module. The interface is used for posture control of the robotic hand prosthesis. The joint angles measured by the dataglove are transferred to the main controller via the wireless module. The controller either directly controls the joint angles of the hand or recognizes hand postures using pattern recognition methods such as LDA and k-NN. The hand postures recognized in this study are paper, rock, scissors, precision grasp, and tip grasp. Experiments demonstrate the performance of the wearable interface, including the pattern recognition method.
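
A hedged sketch of the pattern recognition stage: the snippet below classifies dataglove joint-angle vectors into the five postures with k-NN. The feature layout (ten joint angles) and the synthetic training data are assumptions for illustration only, not the paper's data.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    POSTURES = ["paper", "rock", "scissors", "precision", "tip"]

    # Toy training set: each row is a joint-angle vector (degrees) from the glove.
    rng = np.random.default_rng(0)
    prototypes = rng.uniform(0, 90, size=(5, 10))       # one mean pose per posture
    X_train = np.repeat(prototypes, 20, axis=0) + rng.normal(0, 5, (100, 10))
    y_train = np.repeat(POSTURES, 20)

    clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

    sample = prototypes[1] + rng.normal(0, 5, 10)       # a noisy "rock" reading
    print(clf.predict([sample])[0])                     # -> most likely "rock"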

Periodic Biometric Information Collection Interface Method for Wearable Vulnerable Users

  • Lee, Taegyu
    • International Journal of Advanced Smart Convergence / v.10 no.3 / pp.33-40 / 2021
  • Recently, wearable computers such as smart watches, smart bands, and smart patches have been released with various biosensors that support the daily health management of ordinary users as well as patients. Users of wearable computers such as smart watches face various difficulties in performing biometric information processes (data sensing, collection, transmission, real-time analysis, and feedback) in a weak wireless and mobile service environment. In particular, the biometric information collection interface is an important basic process that determines the quality and performance of the entire biometric information service. So far, research has focused on sensing hardware and logic. This study concentrates on the interface method for effectively sensing and collecting raw biometric information; in particular, the collection process is designed and analyzed from the perspective of periodicity. On this basis, we propose an efficient and stable periodic collection method.
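
A minimal sketch of what fixed-period collection with batched transmission could look like, assuming a hypothetical read_sensor() device call and transmit() network call; the period and batch size are illustrative values, not the paper's parameters.

    import time, random

    PERIOD_S = 1.0        # sampling period
    BATCH = 5             # readings buffered per transmission

    def read_sensor():
        return 60 + random.random() * 40     # e.g. heart rate in bpm (simulated)

    def transmit(batch):
        print(f"sending {len(batch)} samples: {batch}")

    buffer = []
    next_t = time.monotonic()
    for _ in range(10):                      # bounded loop for the demo
        buffer.append(read_sensor())
        if len(buffer) >= BATCH:             # flush in batches to tolerate a weak link
            transmit(buffer)
            buffer = []
        next_t += PERIOD_S                   # fixed-period schedule, no drift
        time.sleep(max(0.0, next_t - time.monotonic()))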

Eye Gaze Interface in Wearable System (웨어러블 시스템에서 눈동자의 움직임을 이용한 인터페이스)

  • 권기문; 이정준; 박강령; 김재희
    • Proceedings of the IEEK Conference / 2003.07e / pp.2124-2127 / 2003
  • This paper proposes a user interface method for wearable computers based on gaze detection in a head-mounted display (HMD) environment. The system works as follows: first, the camera in the HMD is calibrated to determine the geometric relationship between the monitor and the captured image. Second, the center of the pupil is detected with an ellipse fitting algorithm and mapped to a gaze position on the computer screen. If the user blinks or stares at a position for a while, a message is sent to the wearable computer. Experimental results show that ellipse fitting is robust against glint effects; the detection error was 6.5% in the vertical direction and 4.25% in the horizontal direction.
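
A hedged sketch of the pupil-center step using OpenCV: threshold the dark pupil region and fit an ellipse to its largest contour. The threshold value and preprocessing are assumptions; the paper's exact pipeline may differ.

    import cv2
    import numpy as np

    def pupil_center(eye_gray):
        """eye_gray: grayscale eye image -> (cx, cy) pupil center, or None."""
        # The pupil is the darkest region; inverse-threshold to isolate it.
        _, mask = cv2.threshold(eye_gray, 40, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        pupil = max(contours, key=cv2.contourArea)   # largest dark blob
        if len(pupil) < 5:                           # fitEllipse needs >= 5 points
            return None
        (cx, cy), _axes, _angle = cv2.fitEllipse(pupil)
        return cx, cy                                # ellipse center ~ pupil center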

WalkON Suit: A Wearable Robot for Complete Paraplegics (WalkON Suit: 하지 완전마비 장애인을 위한 웨어러블 로봇)

  • Choi, Jungsu; Na, Byeonghun; Jung, Pyeong-Gook; Rha, Dong-wook; Kong, Kyoungchul
    • The Journal of Korea Robotics Society / v.12 no.2 / pp.116-123 / 2017
  • Wearable robots are receiving great attention from the public as well as from researchers, because their motivation is to improve people's quality of life. Above all, patients with complete paraplegia due to spinal cord injury (SCI) may be the most appropriate target users of wearable robots, because they clearly need physical assistance owing to the complete loss of muscular strength and sensory function. Furthermore, medical care of complete paraplegics using wearable robots has significantly reduced the mortality rate and improved life expectancy. The requirements for a wearable robot for complete paraplegics are actuation torque, locomotion speed, wearing sensation, robust gait stability, safety, and practicality (i.e., size, volume, weight, and energy efficiency). The WalkON Suit is a wearable robot that satisfies these requirements and participated in the powered exoskeleton race of Cybathlon 2016. In this paper, the configuration of the WalkON Suit, its human-machine interface, gait pattern, control algorithm, and evaluation results are introduced.
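
For intuition only, here is a toy finite-state sketch of the kind of user-triggered gait sequencing that exoskeletons of this class commonly use; the states, triggers, and transitions are invented here and are not the WalkON Suit's actual control algorithm, which the paper itself describes.

    # Hypothetical gait phases keyed by (state, trigger) -> next state.
    GAIT_FSM = {
        ("sitting",     "stand_btn"):   "standing",
        ("standing",    "step_btn"):    "left_swing",
        ("left_swing",  "heel_strike"): "right_swing",
        ("right_swing", "heel_strike"): "left_swing",
        ("left_swing",  "stop_btn"):    "standing",
        ("right_swing", "stop_btn"):    "standing",
        ("standing",    "sit_btn"):     "sitting",
    }

    def step(state, event):
        """Advance the gait state; unknown events keep the current state."""
        return GAIT_FSM.get((state, event), state)

    state = "sitting"
    for ev in ["stand_btn", "step_btn", "heel_strike", "heel_strike", "stop_btn"]:
        state = step(state, ev)
        print(ev, "->", state)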

Evaluating Performance of Pointing Interface of Dynamic Gain Control in Wearable Computing Environment (웨어러블 컴퓨터 환경에서 포인팅 인터페이스의 동적 이득 방법의 효율성 평가)

  • Hong, Ji-Young; Chae, Haeng-Suk; Han, Kwang-Hee
    • Journal of the Ergonomics Society of Korea / v.26 no.4 / pp.9-16 / 2007
  • Input devices for wearable computers are difficult to use, so many alternative pointing devices have been considered in recent years. To address this problem, this paper proposes a dynamic gain control method that improves the performance of wearable pointing devices, and presents experimental results comparing it with the conventional method. The effects of the methods were also compared across devices (wearable and desktop). Throughput (index of performance) calculated by Fitts' law showed that pointing performance in the dynamic gain condition was 1.4 times higher than with normal gain.
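
For readers unfamiliar with the metric, the sketch below shows the standard Shannon formulation of Fitts' law throughput, ID = log2(D/W + 1) divided by movement time. The trial distances, widths, and times are made up for illustration, not taken from the paper.

    import math

    def throughput(d, w, mt):
        """Index of difficulty log2(D/W + 1) in bits over movement time in s."""
        return math.log2(d / w + 1) / mt

    # Hypothetical trials: (target distance D px, target width W px, movement time s)
    trials = [(256, 32, 1.10), (512, 32, 1.45), (256, 16, 1.38)]
    tps = [throughput(d, w, mt) for d, w, mt in trials]
    print(f"mean throughput: {sum(tps) / len(tps):.2f} bits/s")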

Development of an EMG-based Wireless and Wearable Computer Interface (근전도기반의 무선 착용형 컴퓨터 인터페이스 개발)

  • Han, Hyo-Nyoung; Choi, Chang-Mok; Lee, Yun-Joo; Ha, Sung-Do; Kim, Jung
    • Proceedings of the HCI Society of Korea Conference / 2008.02a / pp.240-244 / 2008
  • This paper presents an EMG-based wireless wearable computer interface. The wearable device contains four EMG sensor channels and acquires EMG signals with on-board signal processing. The acquired signals are transmitted to a host computer over a wireless link. EMG signals induced by volitional movements are acquired from four sites on the lower limb to extract the user's intention, and six classes of wrist movements are discriminated by an artificial neural network (ANN). This interface could help people with limb disabilities access computers and network environments directly, without conventional computer interfaces such as a keyboard and mouse.
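
A hedged sketch of the classification stage: mean-absolute-value (MAV) features per EMG channel feeding a small neural network that discriminates six movement classes. The window length, feature choice, network size, and synthetic data are all assumptions; the paper's ANN details may differ.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    N_CH, WIN, N_CLASSES = 4, 200, 6     # 4 channels, 200-sample windows (assumed)

    def mav_features(window):
        """window: (N_CH, WIN) raw EMG -> one MAV value per channel."""
        return np.mean(np.abs(window), axis=1)

    rng = np.random.default_rng(1)
    # Toy dataset: per class, windows whose channel amplitudes differ.
    X, y = [], []
    for c in range(N_CLASSES):
        scale = 1 + rng.random(N_CH) * 3          # class-specific channel gains
        for _ in range(30):
            X.append(mav_features(rng.normal(0, scale[:, None], (N_CH, WIN))))
            y.append(c)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(np.array(X), y)
    print("training accuracy:", clf.score(np.array(X), y))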

Hand Gesture Interface for Wearable PC

  • Nishihara, Isao; Nakano, Shizuo
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.664-667 / 2009
  • There is strong demand for wearable PC systems that can support the user outdoors. When we are outdoors, our movement makes it impossible to use traditional input devices such as keyboards and mice. We propose a hand gesture interface based on image processing to operate wearable PCs. A semi-transparent PC screen is displayed on the head-mounted display (HMD), and the user makes hand gestures to select icons on the screen. The user's hand is extracted from images captured by a color camera mounted above the HMD. Since skin color can vary widely due to outdoor lighting, a key problem is accurately discriminating the hand from the background. The proposed method does not assume any fixed skin color space. First, the image is divided into blocks, and blocks with similar average color are linked. The contiguous regions are then subjected to hand recognition, and blocks on the edges of the hand region are subdivided for more accurate finger discrimination. A change in hand shape is recognized as hand movement; our current input interface associates a hand grasp with a mouse click. Tests on a prototype system confirm that the proposed method recognizes hand gestures accurately and at high speed. We intend to develop a wider range of recognizable gestures.
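
A hedged sketch of the block-linking step described above: divide the frame into blocks, average each block's color, and flood-fill over adjacent blocks whose averages are close. The block size and similarity threshold are illustrative values, not the paper's.

    import numpy as np

    BLOCK, THRESH = 16, 30.0   # block edge in pixels, max color distance (assumed)

    def block_regions(img):
        """img: (H, W, 3) uint8 frame -> integer region label per block."""
        h, w = img.shape[0] // BLOCK, img.shape[1] // BLOCK
        means = img[:h * BLOCK, :w * BLOCK].reshape(h, BLOCK, w, BLOCK, 3)
        means = means.mean(axis=(1, 3))            # (h, w, 3) block average colors
        labels = -np.ones((h, w), dtype=int)
        region = 0
        for sy in range(h):
            for sx in range(w):
                if labels[sy, sx] >= 0:
                    continue
                stack = [(sy, sx)]                 # flood-fill over similar blocks
                labels[sy, sx] = region
                while stack:
                    y, x = stack.pop()
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] < 0
                                and np.linalg.norm(means[ny, nx] - means[y, x]) < THRESH):
                            labels[ny, nx] = region
                            stack.append((ny, nx))
                region += 1
        return labels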
