• Title/Summary/Keyword: Gaze-based User Interface (시선기반 사용자 인터페이스)


A Study on Interaction of Gaze-based User Interface in Mobile Virtual Reality Environment (모바일 가상현실 환경에서의 시선기반 사용자 인터페이스 상호 작용에 관한 연구)

  • Kim, Mingyu;Lee, Jiwon;Jeon, Changyu;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society
    • /
    • v.23 no.3
    • /
    • pp.39-46
    • /
    • 2017
  • This study proposes a gaze-based user interface to provide user-oriented interaction suitable for virtual reality environments on mobile platforms. For this purpose, three-dimensional interactive content for the mobile platform is produced to test whether the proposed interface increases user satisfaction through interactions in a mobile virtual reality environment. The gaze-based interface, the most common input method for mobile virtual reality content, is designed around two factors: the field of view and the feedback system. The performance of the proposed gaze-based interface is analyzed through experiments on whether it motivates user interest, enhances immersion, differs in format from existing interfaces, and provides convenience in operating content.
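Gaze feedback of the kind this interface design considers is often implemented as dwell-time selection: staring at a target for a fixed time triggers it, and the elapsed fraction can drive a visual progress cue such as a filling ring. A minimal sketch, assuming hypothetical names (`DwellSelector`, `update`) that are not taken from the paper:

```python
import time

class DwellSelector:
    """Selects a target once gaze has rested on it for `dwell_time` seconds.

    The returned progress (0.0-1.0) can drive visual feedback for the user.
    """

    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time
        self.target = None
        self.start = None

    def update(self, gazed_target, now=None):
        """Feed the currently gazed-at target id (or None) once per frame.

        Returns (selected_target_or_None, feedback_progress).
        """
        now = time.monotonic() if now is None else now
        if gazed_target != self.target:  # gaze moved: restart the dwell timer
            self.target = gazed_target
            self.start = now if gazed_target is not None else None
        if self.target is None:
            return None, 0.0
        progress = (now - self.start) / self.dwell_time
        if progress >= 1.0:  # dwell complete: fire once, then reset
            selected, self.target, self.start = self.target, None, None
            return selected, 1.0
        return None, progress
```

Calling `update` each frame with the object under the gaze ray yields a selection event after one second of steady fixation, with intermediate progress values available for the feedback system.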

Development of Integrated Analysis Model for Eyegaze Analysis - With Emphasis on the Generation of Heuristic Guidelines for User Interface Design - (시선추적 분석을 위한 통합 해석 모델의 개발 - 사용자 인터페이스 디자인을 위한 휴리스틱 가이드라인의 도출을 중심으로 -)

  • 성기원;이건표
    • Archives of Design Research
    • /
    • v.17 no.2
    • /
    • pp.23-32
    • /
    • 2004
  • This paper's objectives are to analyze eye-movement recordings in terms of the visual perception process, to infer heuristic guidelines from human information processing, and to generate design principles for practical work. To this end, users' eye movements while using interactive media were recorded with the Eyegaze Interface System; the visual perception process was analyzed in terms of top-down and bottom-up processing, and design principles were inferred from the human information process. The results provide design implications drawn from the analysis of users' eye-movement recordings, which changed with each menu depth of the interactive media. A new concept of heuristic guidelines is also proposed, based on the stages of action related to human factors.

A Gesture-based Control Interface Design for Handheld Game Consoles Using Accelerometer (가속도 센서를 이용한 동작 기반 휴대용 게임기 조작 인터페이스 디자인)

  • Go, Geon-Hyeok;Bang, Mi-Hyang;Seo, Jae-Woo;Cho, Sun-Young
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2007.02b
    • /
    • pp.381-386
    • /
    • 2007
  • The control interface of handheld game consoles has changed little since their first appearance. This fixed control interface has limited users' interaction experience with handheld consoles. Moving away from this conventional control scheme, we designed a handheld game console control interface that enables richer interaction. Case studies and user research identified the main problems that could arise when introducing a motion-recognition interface to a handheld console: 1. screen visibility, 2. motion perception of game elements, 3. resistance to an unfamiliar interface, 4. self-consciousness about others' attention, and 5. weight. We therefore proposed a design that solves these problems by separating the control unit and the screen unit from the main body. The user detaches the control unit, holds it in one hand, and moves it up, down, left, or right; a built-in accelerometer senses the direction and speed of the movement and transmits them to the main body's processor. With the other hand, the user holds the main body and plays while watching the screen. User tests with a prototype showed that the anticipated problems could be solved.
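The detached controller described above reports movement direction through its accelerometer. A minimal sketch of one way to classify the dominant tilt direction from raw readings; the axis convention and `threshold` value are assumptions for illustration, not details from the paper:

```python
def tilt_direction(ax, ay, threshold=0.3):
    """Classify a handheld controller's tilt from accelerometer readings.

    ax, ay are accelerations (in g) along the device's x (right) and
    y (up) axes; the dominant axis beyond `threshold` wins.
    Returns 'left', 'right', 'up', 'down', or 'neutral'.
    """
    if abs(ax) < threshold and abs(ay) < threshold:
        return 'neutral'  # dead zone: ignore small hand tremor
    if abs(ax) >= abs(ay):
        return 'right' if ax > 0 else 'left'
    return 'up' if ay > 0 else 'down'
```

A real implementation would also low-pass-filter the readings and use magnitude for speed, but the dead-zone-plus-dominant-axis pattern is the core of mapping raw sensor values to discrete game inputs.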

Technology Development for Non-Contact Interface of Multi-Region Classifier based on Context-Aware (상황 인식 기반 다중 영역 분류기 비접촉 인터페이스기술 개발)

  • Jin, Songguo;Rhee, Phill-Kyu
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.20 no.6
    • /
    • pp.175-182
    • /
    • 2020
  • Non-contact eye tracking is a nonintrusive human-computer interface that provides hands-free communication for people with severe disabilities. Recently, it has also been expected to play an important role in non-contact systems due to the coronavirus (COVID-19) pandemic. This paper proposes a novel approach to an eye mouse using an eye-tracking method based on a context-aware AdaBoost multi-region classifier and an ASSL algorithm. The conventional AdaBoost algorithm, however, cannot provide sufficiently reliable performance in face tracking for eye cursor pointing estimation, because it cannot take advantage of the spatial context relations among facial features. We therefore propose an eye-region-context-based AdaBoost multiple classifier for efficient non-contact gaze tracking and mouse implementation. The proposed method detects, tracks, and aggregates various eye features to estimate the gaze, and adjusts active and semi-supervised learning based on the on-screen cursor. The proposed system has been successfully applied to eye localization and can also be used to detect and track eye features. The system moves the computer cursor along the user's gaze, with Gaussian modeling and a Kalman filter applied as postprocessing to prevent jitter during real-time tracking. Target objects were randomly generated, and the eye-tracking performance was analyzed in real time according to Fitts's law. The utilization of non-contact interfaces is expected to grow.
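Fitts's law, which the paper uses to analyze pointing performance, relates movement time to an index of difficulty ID = log2(D/W + 1) for target distance D and width W; throughput is ID divided by movement time. A small sketch of those two formulas, assuming the Shannon formulation (the paper does not state which variant it used):

```python
import math

def fitts_id(distance, width):
    """Shannon formulation of Fitts's index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput (bits/s) for one pointing trial: ID / movement time."""
    return fitts_id(distance, width) / movement_time
```

For example, a target 300 px away and 100 px wide has ID = log2(4) = 2 bits; reaching it in 0.5 s gives a throughput of 4 bits/s, which lets gaze pointing be compared against a mouse on equal terms.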

Adaptive Zoom-based Gaze Tracking for Enhanced Accuracy and Precision (정확도 및 정밀도 향상을 위한 적응형 확대 기반의 시선 추적 기법)

  • Song, Hyunjoo;Jo, Jaemin;Kim, Bohyoung;Seo, Jinwook
    • KIISE Transactions on Computing Practices
    • /
    • v.21 no.9
    • /
    • pp.610-615
    • /
    • 2015
  • The accuracy and precision of video-based remote gaze trackers are affected by numerous factors (e.g., the head movement of the participant). However, it is challenging to control all influencing factors, and doing so (e.g., using a chin rest to control the geometry) can forfeit the benefit of using gaze trackers, i.e., the ecological validity of their unobtrusive nature. We propose an adaptive zoom-based gaze tracking technique, ZoomTrack, that addresses this problem by improving the resolution of the gaze tracking results. Our approach magnifies a region of interest (ROI) and retrieves gaze points at a higher resolution under two different zooming modes: only when the gaze reaches the ROI (temporary) or whenever the participant stares at the stimuli (omnipresent). We compared these against the base case without magnification in a user study, and the results are used to summarize the advantages and limitations of our technique.
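The central idea of zoom-based tracking is that a gaze point measured on a magnified ROI maps back to screen coordinates at proportionally finer granularity, so the tracker's angular error shrinks by the zoom factor. A minimal sketch of that coordinate mapping, assuming the ROI is magnified about its own top-left corner (a detail not specified in the abstract):

```python
def unzoom_gaze(gx, gy, roi, zoom):
    """Map a gaze point measured on a magnified ROI back to screen space.

    roi = (x, y, w, h) is the screen region shown magnified by `zoom`
    around its top-left corner; (gx, gy) is the gaze point on the
    magnified view. Measurement error shrinks by the same factor.
    """
    x, y, w, h = roi
    return x + (gx - x) / zoom, y + (gy - y) / zoom
```

At 2x zoom, a 100 px slip on the magnified view corresponds to only 50 px in the original stimulus, which is the resolution gain the technique exploits.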

A Study on Gamepad/Gaze based Input Processing for Mobile Platform Virtual Reality Contents (모바일 플랫폼 가상현실 콘텐츠에 적합한 게임패드/시선 기반 입력 처리 기술에 관한 연구)

  • Lee, Jiwon;Kim, Mingyu;Jeon, Changyu;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society
    • /
    • v.22 no.3
    • /
    • pp.31-41
    • /
    • 2016
  • This study proposes an input-processing technique suitable for producing mobile platform-based virtual reality content. First, we produce mobile platform-based interactive virtual reality content to be used in experiments, with the goals of improving the immersion of users who experience the content, arousing their interest, and designing input processing that is easy to control. We then design the input-processing technique in two ways: with a gamepad, which is readily accessible on mobile devices, and directly through the user's gaze on the interface. Through experiments on the proposed virtual reality input-processing techniques, we analyze whether they improve users' immersion, arouse interest, and provide convenience in controlling the content. We also verify whether they introduce negative psychological elements such as sickness or fatigue.

MyWorkspace: VR Platform with an Immersive User Interface (MyWorkspace: 몰입형 사용자 인터페이스를 이용한 가상현실 플랫폼)

  • Yoon, Jong-Won;Hong, Jin-Hyuk;Cho, Sung-Bae
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2009.02a
    • /
    • pp.52-55
    • /
    • 2009
  • With the recent development of virtual reality, user interfaces for immersive interaction have been actively investigated. Immersive user interfaces improve the efficiency and capability of information processing in virtual environments providing various services, and enable effective interaction in the fields of ubiquitous and mobile computing. In this paper, we propose a virtual reality platform, MyWorkspace, which renders a 3D virtual workspace using an immersive user interface. We develop an interface that integrates an optical see-through head-mounted display, a Wii remote controller, and a helmet with infrared LEDs. It estimates the user's gaze direction in terms of horizontal and vertical angles based on a model of head movements. MyWorkspace expands the current monitor-based 2D workspace into a layered 3D workspace and renders the part of the 3D virtual workspace corresponding to the gaze direction. The user can arrange various tasks in the virtual workspace and switch between them by moving his or her head. We also verify the performance of the immersive user interface as well as its usefulness through a usability test.
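Estimating a view direction from horizontal and vertical head angles, as MyWorkspace does, amounts to converting yaw and pitch into a unit direction vector that selects which part of the layered workspace to render. A minimal sketch under an assumed right-handed axis convention (the paper does not specify its coordinate frame):

```python
import math

def gaze_vector(yaw_deg, pitch_deg):
    """Unit view vector from horizontal (yaw) and vertical (pitch) angles.

    yaw = 0, pitch = 0 looks straight ahead along +z; positive yaw turns
    right (+x), positive pitch looks up (+y).
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))
```

Intersecting this vector with the layered workspace geometry then determines which task region the user's head is pointed at.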

Usability Test by Integrated Analysis Model - With Emphasis on Eyegaze Analysis of Mobile Interface Design (통합 해석 모델을 활용한 사용성 평가 -모바일 인터페이스 디자인의 시선추적 분석을 중심으로-)

  • 성기원;이건표
    • Archives of Design Research
    • /
    • v.17 no.3
    • /
    • pp.245-254
    • /
    • 2004
  • In accordance with the change of design paradigm, the design process has shifted from a designer-centered to a user-centered workflow. Since the purpose of past research methods was quantitative analysis or understanding of the present situation, they do not fit practical design that expresses users' needs. Therefore, real data about what users see and how they feel is useful for user-centered design. This paper's objective is to analyze users' eye-movement recordings and pupil size for mobile interface design. To this end, users' eyegaze data while using a mobile phone were recorded with the Eyegaze Interface System, and three levels of the users' task performance were analyzed. The results provided an evaluation of a newly developed and an existing mobile phone interface design through eye-movement recordings and pupil-size measurements. The benefit of these results is that they complement the limitations of current usability testing with the visual characteristics of the design and qualitative user data.
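Eye-movement recordings like those analyzed here are commonly segmented into fixations before interpretation; one standard approach is dispersion-threshold identification (I-DT). A minimal sketch of it (the paper does not state which algorithm was used, and the thresholds here are illustrative):

```python
def detect_fixations(points, max_dispersion=30.0, min_samples=5):
    """Minimal dispersion-threshold (I-DT) fixation detection.

    points: list of (x, y) gaze samples at a fixed sampling rate. A window
    is a fixation while (max_x - min_x) + (max_y - min_y) stays within
    `max_dispersion` and it spans at least `min_samples` samples.
    Returns the centroid of each detected fixation.
    """
    def dispersion(w):
        xs, ys = [p[0] for p in w], [p[1] for p in w]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations, start = [], 0
    while start < len(points):
        end = start + min_samples
        if end > len(points):
            break
        if dispersion(points[start:end]) <= max_dispersion:
            # grow the window while the samples stay tightly clustered
            while end < len(points) and dispersion(points[start:end + 1]) <= max_dispersion:
                end += 1
            xs = [p[0] for p in points[start:end]]
            ys = [p[1] for p in points[start:end]]
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            start = end
        else:
            start += 1  # saccade sample: slide the window forward
    return fixations
```

Fixation centroids and durations derived this way are the usual inputs to the kind of menu-depth and task-level analysis the paper reports.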

Gaze Matching Based on Multi-microphone for Remote Tele-conference (멀티 마이크로폰 기반 원격지 간 화상회의 시선 일치 기법)

  • Lee, Daeseong;Jo, Dongsik
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2021.10a
    • /
    • pp.429-431
    • /
    • 2021
  • Recently, as an alternative to face-to-face meetings, the use of video conferencing systems between remote locations has increased. However, video conferencing systems have limitations in terms of the mismatch of users' eye gaze. Therefore, it is necessary to apply a technology that can increase immersion in video conferences by matching the gaze information of participants across remote locations. In this paper, we propose a novel technique to realize video conferencing with matched gaze by estimating the speaker's location based on a multi-microphone array. Our method can be applied to various fields such as robot interaction and virtual human interfaces as well as video conferencing between remote locations.
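Estimating a speaker's direction from a microphone pair typically relies on the time-difference of arrival (TDOA): under a far-field model, sin(θ) = c·τ/d for microphone spacing d, delay τ, and speed of sound c. A minimal sketch of that geometry (the paper's actual array layout and estimator are not specified in the abstract):

```python
import math

def speaker_angle(tdoa, mic_distance, c=343.0):
    """Direction of a speaker from the TDOA between two microphones.

    tdoa: arrival-time difference in seconds; mic_distance: spacing in
    metres; c: speed of sound in m/s. Returns the angle in degrees from
    the broadside (perpendicular) direction, via sin(theta) = c*tdoa/d.
    """
    s = max(-1.0, min(1.0, c * tdoa / mic_distance))  # clamp numeric noise
    return math.degrees(math.asin(s))
```

A zero delay means the speaker is directly in front of the array; the maximum delay d/c corresponds to a source at 90 degrees, along the microphone axis. The estimated angle can then steer the rendered gaze of the remote participant.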

Design of dataglove based multimodal interface for 3D object manipulation in virtual environment (3 차원 오브젝트 직접조작을 위한 데이터 글러브 기반의 멀티모달 인터페이스 설계)

  • Lim, Mi-Jung;Park, Peom
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2006.02a
    • /
    • pp.1011-1018
    • /
    • 2006
  • A multimodal interface is a cognition-based technology that interprets and encodes information about natural human behaviors such as gestures, gaze, hand movements, behavioral patterns, speech, and physical location. In this paper, we design and implement a 3D-object-based multimodal interface using gesture, voice, and touch. The service domain is the smart home, and users can remotely monitor and control home objects through direct manipulation of 3D objects. Because multiple modalities must be recognized and processed in parallel during multimodal input and output, the combination and encoding of the modalities and the input/output formats become issues. Based on an analysis of the characteristics of the modalities and the human cognitive structure, this study presents an input combination scheme for the gesture, voice, and touch modalities and designs an efficient multimodal prototype for 3D object interaction.