• Title/Summary/Keyword: implicit intent recognition

Discriminant Analysis of Human's Implicit Intent based on Eyeball Movement (안구운동 기반의 사용자 묵시적 의도 판별 분석 모델)

  • Jang, Young-Min;Mallipeddi, Rammohan;Kim, Cheol-Su;Lee, Minho
    • Journal of the Institute of Electronics and Information Engineers / v.50 no.6 / pp.212-220 / 2013
  • Recently, there has been a tremendous increase in human-computer/machine interaction systems, where the goal is to provide an appropriate service to the user at the right time with minimal human input for a human augmented cognition system. To develop an efficient human augmented cognition system based on human-computer/machine interaction, it is important to interpret the user's implicit intention, which is vague, in addition to the explicit intention. According to cognitive visual-motor theory, human eye movements and pupillary responses are rich sources of information about human intention and behavior. In this paper, we propose a novel approach for identifying human implicit visual search intention based on eye movement patterns and pupillary analysis, using features such as pupil size, the gradient of pupil size variation, and fixation length/count for the area of interest. The proposed model classifies the human's implicit intention into three types: navigational intent generation, informational intent generation, and informational intent disappearance. Navigational intent refers to searching an input scene for something interesting with no specific instructions, while informational intent refers to searching for a particular target object at a specific location in the input scene. Based on the eye movement patterns and pupillary analysis, we used a hierarchical support vector machine that detects the transitions between the different implicit intents: from navigational intent generation to informational intent generation, and to informational intent disappearance.
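A minimal sketch of the kind of two-stage ("hierarchical") SVM classification this abstract describes is given below. The stage split (navigational vs. informational first, then generation vs. disappearance), the exact four-dimensional feature vector, and the synthetic data are assumptions for illustration only, not the authors' actual pipeline.

```python
# Hypothetical two-stage SVM intent classifier (sketch; not the paper's implementation).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Assumed 4-D feature vector per gaze sample:
# [mean pupil size, gradient of pupil size variation,
#  fixation length on the AOI, fixation count on the AOI]
def make_synthetic(n, offset):
    return rng.normal(loc=offset, scale=1.0, size=(n, 4))

X_nav  = make_synthetic(200, offset=0.0)   # navigational intent generation
X_info = make_synthetic(200, offset=1.5)   # informational intent generation
X_gone = make_synthetic(200, offset=3.0)   # informational intent disappearance

# Stage 1: navigational vs. informational (generation or disappearance).
X1 = np.vstack([X_nav, X_info, X_gone])
y1 = np.array([0] * len(X_nav) + [1] * (len(X_info) + len(X_gone)))
stage1 = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X1, y1)

# Stage 2: within "informational", generation vs. disappearance.
X2 = np.vstack([X_info, X_gone])
y2 = np.array([0] * len(X_info) + [1] * len(X_gone))
stage2 = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X2, y2)

def classify(sample):
    """Return one of the three intent states for a single feature vector."""
    sample = sample.reshape(1, -1)
    if stage1.predict(sample)[0] == 0:
        return "navigational intent generation"
    if stage2.predict(sample)[0] == 0:
        return "informational intent generation"
    return "informational intent disappearance"

print(classify(make_synthetic(1, offset=1.5)[0]))
```

In this sketch the hierarchy first separates broad intent categories and only then resolves the finer transition, which mirrors the transition-detection framing in the abstract.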

Development of an EMG-Based Car Interface Using Artificial Neural Networks for the Physically Handicapped (신경망을 적용한 지체장애인을 위한 근전도 기반의 자동차 인터페이스 개발)

  • Kwak, Jae-Kyung;Jeon, Tae-Woong;Park, Hum-Yong;Kim, Sung-Jin;An, Kwang-Dek
    • Journal of Information Technology Services / v.7 no.2 / pp.149-164 / 2008
  • As the computing landscape shifts to ubiquitous computing environments, there is a growing demand for device controls that react to the user's implicit activities without excessively drawing the user's attention. We developed an EMG-based car interface that enables the physically handicapped to drive a car using their functioning peripheral nerves. Our method extracts electromyogram signals caused by wrist movements from four places on the user's forearm and then infers the user's intent from the signals using multi-layered neural nets. This makes it possible for the user to control the operation of car equipment and thus to drive the car. It also allows the user to enter inputs into the embedded computer through a user interface such as an instrument LCD panel. We validated the effectiveness of our method through experimental use in a car built with the EMG-based interface.
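A minimal sketch of the kind of EMG-to-intent mapping this abstract describes follows. The four EMG channels come from the abstract; the window length, the mean-absolute-value feature, the command labels, and the synthetic data are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical 4-channel EMG classifier with a small multi-layer neural net (sketch).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
COMMANDS = ["left", "right", "accelerate", "brake"]  # assumed command set

def mav_features(window):
    """Mean absolute value per channel for one raw-EMG window of shape (samples, 4)."""
    return np.mean(np.abs(window), axis=0)

# Synthetic training data: each command biases activity on a different channel.
X, y = [], []
for label, ch in enumerate(range(4)):
    for _ in range(150):
        window = rng.normal(0.0, 0.2, size=(200, 4))
        window[:, ch] += rng.normal(1.0, 0.2, size=200)  # stronger activity on one channel
        X.append(mav_features(window))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0).fit(X, y)

# Inference on a new window: the predicted label would drive the car interface.
test = rng.normal(0.0, 0.2, size=(200, 4))
test[:, 2] += 1.0
print(COMMANDS[clf.predict([mav_features(test)])[0]])
```

The mean-absolute-value feature is one common EMG summary; the paper does not specify its feature extraction, so this stage in particular should be read as a placeholder.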