• Title/Summary/Keyword: Wearable Interface

Wearable User Interface based on EOG and Marker Recognition (EOG와 마커인식을 이용한 착용형 사용자 인터페이스)

  • Kang, Sun-Kyoung;Jung, Sung-Tae;Lee, Sang-Seol
    • Journal of the Korea Society of Computer and Information / v.11 no.6 s.44 / pp.133-141 / 2006
  • Many wearable computers have been developed recently, but they still have significant user interface problems from both an input and an output perspective. This paper presents a wearable user interface based on an EOG (electrooculogram) sensing circuit and marker recognition. In the proposed interface, the EOG sensor circuit, which tracks eye movement by sensing the potential difference across the eye, is used as a pointing device. Objects to be manipulated are represented by human-readable markers, and the marker recognition system detects and recognizes these markers in the camera input image. When a marker is recognized, the corresponding property window and method window are displayed on the head-mounted display, and the user manipulates the object by selecting a property or method item from the window. With the EOG sensor circuit and the marker recognition system, an object can be manipulated with eye movement alone in a wearable computing environment (a minimal sketch of this interaction loop is given after this entry).

  • PDF
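
The EOG-as-pointer plus marker-window interaction described above can be pictured as a simple sensing loop. The sketch below is a minimal illustration under assumed interfaces: the gain constant, the marker dictionary, and helper names such as eog_to_delta and hit_marker are invented here and are not the authors' implementation.

```python
# Minimal sketch of an EOG-pointer + marker-selection loop (illustrative only).

GAIN = 40.0  # pixels per volt, a made-up calibration constant

def eog_to_delta(h_volts, v_volts, gain=GAIN):
    """Map horizontal/vertical EOG potentials to a pointer movement in pixels."""
    return h_volts * gain, v_volts * gain

def update_pointer(pointer, sample, screen=(800, 600)):
    """Integrate one EOG sample into the pointer position, clamped to the display."""
    dx, dy = eog_to_delta(*sample)
    x = min(max(pointer[0] + dx, 0), screen[0] - 1)
    y = min(max(pointer[1] + dy, 0), screen[1] - 1)
    return (x, y)

def hit_marker(pointer, markers):
    """Return the recognized marker whose bounding box contains the pointer, if any."""
    for marker in markers:
        x0, y0, x1, y1 = marker["bbox"]
        if x0 <= pointer[0] <= x1 and y0 <= pointer[1] <= y1:
            return marker
    return None

if __name__ == "__main__":
    # Simulated data: one recognized marker and a short burst of EOG samples
    markers = [{"name": "lamp", "bbox": (300, 200, 400, 300),
                "properties": ["brightness"], "methods": ["on", "off"]}]
    samples = [(0.5, 0.2), (0.8, 0.4), (1.2, 0.6), (0.0, 0.0)]

    pointer = (250.0, 180.0)
    for s in samples:
        pointer = update_pointer(pointer, s)
        marker = hit_marker(pointer, markers)
        if marker:
            # In the paper the HMD would show property/method windows at this point
            print(f"pointer {pointer} on '{marker['name']}':",
                  marker["properties"], marker["methods"])
            break
```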

Development of Interface device with EOG Signal (EOG(Electro-oculogram) 신호를 이용한 Interface 장치 개발)

  • Kim, Su-Jong;Ryu, Ho-Sun;Kim, Young-Chol
    • Proceedings of the KIEE Conference / 2006.07d / pp.1821-1823 / 2006
  • This paper presents the development of an interface device for electro-oculogram (EOG) signals and its application as a wireless mouse for a wearable PC. The interface device is composed of five bio-electrodes for detecting oculomotor motion, several band-pass filters, an instrumentation amplifier, and a microprocessor. We first analyzed the impedance characteristics between the skin and a bio-electrode; since the impedance depends strongly on the individual face, its magnitude differs from person to person. The interface device was applied as a Bio-Machine Interface (BMI) to build a wireless mouse for a wearable PC, in which the cursor on the PC monitor is controlled by EOG signals alone. The system was implemented in a Head Mounted Display (HMD) unit, and experimental results show an accuracy above 90% (a simplified sketch of the filtering and cursor-mapping chain is given after this entry).

  • PDF
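
The signal chain named above (bio-electrodes, band-pass filters, instrumentation amplifier, microprocessor) can be approximated in software. The sketch below is a rough illustration, not the authors' firmware: it applies a first-order digital band-pass to a simulated EOG channel and thresholds the result into horizontal cursor steps; all cutoffs, gains, and thresholds are assumed values.

```python
# Simplified EOG channel conditioning: drift removal + smoothing + thresholding.

import math

def band_pass(samples, fs, f_low=0.1, f_high=10.0):
    """First-order high-pass (drift removal) followed by first-order low-pass (noise)."""
    dt = 1.0 / fs
    a_hp = 1.0 / (2 * math.pi * f_low * dt + 1)
    a_lp = (2 * math.pi * f_high * dt) / (2 * math.pi * f_high * dt + 1)
    out, prev_x, hp, lp = [], 0.0, 0.0, 0.0
    for x in samples:
        hp = a_hp * (hp + x - prev_x)   # high-pass state
        prev_x = x
        lp = lp + a_lp * (hp - lp)      # low-pass state
        out.append(lp)
    return out

def to_cursor_steps(filtered, threshold=0.3, step=15):
    """Turn filtered EOG amplitude into horizontal cursor steps (pixels per sample)."""
    steps = []
    for v in filtered:
        if v > threshold:
            steps.append(+step)   # gaze right
        elif v < -threshold:
            steps.append(-step)   # gaze left
        else:
            steps.append(0)       # fixation
    return steps

if __name__ == "__main__":
    fs = 100  # Hz, assumed sampling rate
    # Simulated raw channel: slow electrode drift plus a rightward saccade pulse
    raw = [0.002 * n for n in range(100)]
    raw = [v + (1.0 if 40 <= n < 60 else 0.0) for n, v in enumerate(raw)]
    moves = to_cursor_steps(band_pass(raw, fs))
    print("nonzero cursor steps:", sum(1 for m in moves if m != 0))
```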

Interaction and Interface Design of Smart Watches (스마트워치 인터렉션 및 인터페이스 디자인)

  • Lim, Da-Eun;Wang, Lin
    • The Journal of the Korea Contents Association / v.15 no.3 / pp.11-20 / 2015
  • Recently, with the smartphone market slowing down, more attention has been paid to wearable devices. It was suggested that 2013 was the "year of the smart watch", because the majority of major consumer electronics manufacturers were working on a smart watch device. However, many issues remain in the interaction and interface design of smart watches. This study reviewed the related literature and evaluated the interaction methods, interfaces, and content design of current smart watches, and proposed future research directions on that basis. The results offer meaningful guidance for the future direction of smart watch design.

Design of Multimodal User Interface using Speech and Gesture Recognition for Wearable Watch Platform (착용형 단말에서의 음성 인식과 제스처 인식을 융합한 멀티 모달 사용자 인터페이스 설계)

  • Seong, Ki Eun;Park, Yu Jin;Kang, Soon Ju
    • KIISE Transactions on Computing Practices / v.21 no.6 / pp.418-423 / 2015
  • As technology advances at exceptional speed, the functions of wearable devices become more diverse and complicated, and many users find some of them difficult to use. The main aim of this paper is to provide the user with an interface that is friendlier and easier to use. Speech recognition is easy to use and makes it easy to issue commands. However, it is problematic on a wearable device with limited computing power and battery capacity: the device cannot predict when the user will speak a command, so the speech recognizer would have to stay active at all times, and the energy spent waiting for a command makes this impractical. To solve this problem, gesture recognition is used to activate speech recognition only when needed. This paper describes how to combine speech and gesture recognition into a multimodal interface that increases the user's comfort (sketched below).
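
One way to realize the gesture-gated design described above is to keep only a cheap motion detector always on and start the speech recognizer when a wake gesture is seen. The following sketch assumes a wrist-raise threshold on accelerometer data and a stand-in recognizer function; both are illustrative placeholders, not the paper's implementation.

```python
# Minimal sketch of gating speech recognition with an always-on gesture detector.

from typing import Callable, Iterable, Tuple

def wrist_raise_detected(accel_sample: Tuple[float, float, float],
                         threshold: float = 8.0) -> bool:
    """Crude wake gesture: a large z-axis reading suggests the wrist was raised."""
    _, _, z = accel_sample
    return abs(z) > threshold

def multimodal_loop(accel_stream: Iterable[Tuple[float, float, float]],
                    recognize_speech: Callable[[], str]) -> None:
    """Poll the low-power gesture sensor; run the costly speech stage only on demand."""
    for sample in accel_stream:
        if wrist_raise_detected(sample):
            print("wake gesture detected, starting speech recognition")
            command = recognize_speech()   # expensive stage, normally switched off
            print("recognized command:", command)
        # otherwise the speech engine stays off, saving battery

if __name__ == "__main__":
    # Simulated accelerometer stream: mostly at rest, one wrist-raise spike
    stream = [(0.1, 0.0, 1.0)] * 5 + [(0.2, 0.1, 9.6)] + [(0.1, 0.0, 1.0)] * 3
    multimodal_loop(stream, lambda: "open heart-rate app")  # stand-in for a real ASR
```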

User interface of Home-Automation for the physically handicapped Person in wearable computing environment (웨어러블 환경에서의 수족사용 불능자를 위한 홈오토메이션 사용자 인터페이스)

  • Kang, Sun-Kyung;Kim, Young-Un;Han, Dae-Kyung;Jung, Sung-Tae
    • Journal of the Korea Society of Computer and Information / v.13 no.5 / pp.187-193 / 2008
  • Interface technologies that let a user control a home automation system in a wearable computing environment have been studied recently. This paper proposes a new interface method that enables a disabled person to control a home automation system in a wearable computing environment using an EOG sensing circuit and marker recognition. In the proposed method, the operations of a home network device are represented by human-readable markers displayed around the device. A user wearing an HMD, a video camera, and a computer selects the desired operation by looking at the markers and choosing one of them with eye movement on the HMD display. The requested operation is executed by sending the control command associated with the selected marker to the home network control device. With the EOG sensing circuit and the marker recognition system, a user who cannot move their hands and feet can operate a home automation system with eye movement alone (a sketch of the marker-to-command dispatch step is given after this entry).

  • PDF
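
The final dispatch step described above, turning a selected marker into a command for the home network controller, might look roughly like the following. The command table, the JSON-over-UDP transport, and the controller address are assumptions made for illustration; the paper does not specify this protocol.

```python
# Illustrative marker-to-command dispatch for a home network controller.

import json
import socket

# Hypothetical marker-id -> device command table
COMMANDS = {
    "TV_POWER":  {"device": "tv",    "action": "toggle_power"},
    "LIGHT_ON":  {"device": "light", "action": "on"},
    "LIGHT_OFF": {"device": "light", "action": "off"},
    "DOOR_OPEN": {"device": "door",  "action": "open"},
}

def command_for_marker(marker_id):
    """Look up the control command associated with a recognized marker."""
    return COMMANDS.get(marker_id)

def send_command(command, host="192.168.0.10", port=9000, timeout=1.0):
    """Send the command to the home network control device as a JSON datagram."""
    payload = json.dumps(command).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(payload, (host, port))

if __name__ == "__main__":
    selected = "LIGHT_ON"                 # marker chosen by eye movement on the HMD
    cmd = command_for_marker(selected)
    print("dispatching:", cmd)
    # send_command(cmd)  # uncomment on a network with a controller listening
```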

A Light-weight ANN-based Hand Motion Recognition Using a Wearable Sensor (웨어러블 센서를 활용한 경량 인공신경망 기반 손동작 인식기술)

  • Lee, Hyung Gyu
    • IEMEK Journal of Embedded Systems and Applications / v.17 no.4 / pp.229-237 / 2022
  • Motion recognition is very useful for implementing an intuitive HMI (Human-Machine Interface). In particular, the hands are the body parts that can move most precisely with a relatively small amount of energy, so hand motion has long served as an efficient communication interface with other people and with machines. In this paper, we design and implement light-weight ANN (Artificial Neural Network)-based hand motion recognition using a state-of-the-art flex sensor. The proposed design consists of data collection from a wearable flex sensor, preprocessing filters, and a light-weight NN (Neural Network) classifier. To verify its performance and functionality, we implement the design on a low-end embedded device. Our experiments and prototype implementation demonstrate that the proposed hand motion recognition achieves an accuracy of up to 98.7% (the pipeline structure is sketched below).
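
The three-stage pipeline named in the abstract (sensor data collection, preprocessing filters, light-weight NN classifier) is sketched below in plain Python with invented features and untrained weights; it shows the structure only and is not the paper's trained model.

```python
# Structural sketch of a flex-sensor -> filter -> tiny-MLP classification pipeline.

import math
import random

def moving_average(signal, window=5):
    """Simple preprocessing filter smoothing the raw flex-sensor readings."""
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def extract_features(signal):
    """Tiny feature vector: mean, range, and mean absolute first difference."""
    diffs = [abs(b - a) for a, b in zip(signal, signal[1:])]
    return [sum(signal) / len(signal),
            max(signal) - min(signal),
            sum(diffs) / max(len(diffs), 1)]

class TinyMLP:
    """One hidden layer, small enough for a low-end embedded device."""
    def __init__(self, n_in=3, n_hidden=8, n_out=3, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
        self.w2 = [[rng.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(n_out)]

    def forward(self, x):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in self.w1]
        return [sum(w * hi for w, hi in zip(row, h)) for row in self.w2]

    def predict(self, x):
        scores = self.forward(x)
        return scores.index(max(scores))

if __name__ == "__main__":
    GESTURES = ["fist", "open hand", "pinch"]            # assumed label set
    raw = [random.gauss(0.5, 0.05) for _ in range(50)]   # simulated flex readings
    features = extract_features(moving_average(raw))
    model = TinyMLP()                                     # untrained, illustrative only
    print("predicted gesture:", GESTURES[model.predict(features)])
```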

Glanceable and Informative WearOS User Interface for Kids and Parents

  • Kim, Siyeon;Yoon, Hyoseok
    • Journal of Multimedia Information System / v.8 no.1 / pp.17-22 / 2021
  • This paper proposes a wearable user interface intended for kids and parents using WearOS smartwatches. We first review what constitutes a kids' smartwatch and then design UI components for watchfaces to be used by kids and parents. UI components covering activity, education, voice search, app usage, video, location, health, and quick dial are described. These components are implemented either as complications or directly on watchfaces, and may require on-device standalone functions, cross-device communication, and an external database. We introduce a theme-based, playful UI for kids, whereas simple and easily accessible components are recommended for the parents' watchface. To illustrate use cases, we present three scenarios for enhancing communication between parents and child. To show the feasibility and potential of our approach, we implement a proof-of-concept using commercial smartwatches, smartphones, and an external cloud database. Furthermore, the performance of checking app usage on different devices is presented, followed by a discussion of limitations and future work.

Development of Wearable Vibrotactile Display Device (착용 가능한 진동촉감 제시 장치 개발)

  • Seo, Chang-Hoon;Kim, Hyun-Ho;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • Proceedings of the HCI Society of Korea Conference (한국HCI학회 학술대회논문집) / 2006.02a / pp.1-6 / 2006
  • Tactile presentation has the advantage that information can be delivered discreetly without disturbing others, and for people with visual or hearing impairments it is an essential means of conveying information. Information delivered through touch can also complement, and sometimes replace, audiovisual channels. This paper proposes a wearable vibrotactile display device that can be used in wearable, mobile, or ubiquitous computing environments. The device arranges 25 vibration motors in a 5×5 array and can display letters and numbers as well as a variety of complex patterns. Each coin-type vibration motor is wrapped in sponge and mounted upright on a soft pad to minimize the spread of vibration, and a new tracing mode that drives the motors sequentially in the stroke order of human handwriting greatly improves the user's recognition rate for letters and numbers. In a user evaluation, presenting the English alphabet on the top of the user's foot yielded a recognition rate of 86.7%. Useful applications are also presented, such as caller identification on a mobile phone and navigation systems (a sketch of the tracing mode is given after this entry).

  • PDF
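
The tracing mode, driving the motors one by one along the stroke order of a character, can be sketched as follows. The 5×5 indexing is taken from the abstract, but the stroke path for the example digit, the pulse timing, and the drive callback are invented for illustration and are not the authors' patterns.

```python
# Minimal sketch of a stroke-order "tracing mode" on a 5x5 vibrotactile array.

import time

ROWS, COLS = 5, 5

# Hypothetical stroke path for the digit "7" as (row, col) cells in writing order
STROKES = {
    "7": [(0, 0), (0, 1), (0, 2), (0, 3), (0, 4),   # top bar, left to right
          (1, 3), (2, 2), (3, 1), (4, 0)],          # diagonal down to the left
}

def motor_index(row, col):
    """Map a grid cell to the index of one of the 25 vibration motors."""
    return row * COLS + col

def trace(symbol, pulse_s=0.08, drive=lambda idx, on: None):
    """Pulse the motors sequentially along the stroke path of `symbol`."""
    for row, col in STROKES[symbol]:
        idx = motor_index(row, col)
        drive(idx, True)        # start this motor
        time.sleep(pulse_s)
        drive(idx, False)       # stop before moving to the next cell
    # On the real device `drive` would switch the coin-type motors via GPIO/driver ICs.

if __name__ == "__main__":
    fired = []
    trace("7", pulse_s=0.0, drive=lambda idx, on: fired.append(idx) if on else None)
    print("motor firing order for '7':", fired)
```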