• Title/Summary/Keyword: 시선통신 (gaze communication)


Look-and-Link: A Gaze-Based Communication Technology That Connects What You See (보는 대로 바로 연결하는 시선(視選)통신 기술)

  • Kim, Seon-Ae;Kim, Yeong-Hun;Kim, Su-Chang;Lee, Mun-Sik;Bang, Seung-Chan
    • Information and Communications Magazine
    • /
    • v.31 no.1
    • /
    • pp.89-93
    • /
    • 2013
  • This article reviews Look-and-Link (시선통신), a technology for connecting to a communication target quickly and conveniently by minimizing user intervention during device and service discovery within close range, in a wireless environment crowded with numerous devices. Look-and-Link connects the user to a target as soon as the user sees and selects it on a smartphone screen, even without knowing the target's identifier (phone number, MAC address, etc.). With only the minimal involvement of selecting the target on the screen, the user's device automatically recognizes the desired target (a printer, TV, shop, or person), obtains its communication identifier, and establishes the connection quickly and easily. In today's environment of surging wireless traffic, Look-and-Link provides a variety of proximity-aware mobile services through direct communication with nearby devices, without help from a separate network.

A Gaze Detection Technique Using a Monocular Camera System (단안 카메라 환경에서의 시선 위치 추적)

  • 박강령;김재희
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.26 no.10B
    • /
    • pp.1390-1398
    • /
    • 2001
  • Gaze detection is the technology of determining which point on a monitor the user is looking at. To estimate the gaze position, this paper extracts the facial region and facial feature points from 2-D camera images. Initially, while the user looks at three points on the monitor, the movements of the facial feature points are observed, and the 3-D positions of the feature points are estimated from them using camera calibration and parameter estimation. When the user then looks at another point on the monitor, the changed 3-D positions of the feature points are obtained through 3-D motion estimation and an affine transform. From these, the facial plane formed by the feature points is computed, and the gaze position on the monitor is obtained from the normal vector of that plane. In experiments with a 19-inch monitor and a user-to-monitor distance of 50-70 cm, a gaze-position error of about 2.08 inches was obtained. This is comparable to the gaze-tracking performance reported in Rikert's paper (5.08 cm error). However, Rikert's method requires the distance between the monitor and the user's face to be kept fixed, and its gaze-tracking error increases under natural facial movement (rotation and translation). It also assumes no complex objects in the background behind the user's face, and its processing time is quite long. In contrast, the method proposed in this paper can be used in an office environment with a complex background and requires a processing time of about 3 seconds (on a 200 MHz Pentium PC).
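The geometric core of the entry above — taking the normal of the plane through the 3-D facial feature points and intersecting that ray with the monitor plane — can be sketched as follows. This is an illustrative reconstruction under assumed coordinates, not the authors' code; the helper names and all numbers are hypothetical.

```python
import numpy as np

def face_plane_normal(p1, p2, p3):
    """Unit normal of the plane through three 3-D facial feature points."""
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

def gaze_point_on_monitor(p0, direction, monitor_origin, monitor_normal):
    """Intersect the gaze ray (from p0 along `direction`) with the monitor plane."""
    t = np.dot(monitor_origin - p0, monitor_normal) / np.dot(direction, monitor_normal)
    return p0 + t * direction

# Hypothetical setup: monitor plane at z = 0, face 60 cm away, parallel to it.
features = np.array([[0.0, 0.0, 60.0], [10.0, 0.0, 60.0], [0.0, 10.0, 60.0]])
n = face_plane_normal(*features)            # points away from the screen here
gaze = gaze_point_on_monitor(features.mean(axis=0), -n,   # flip toward the screen
                             monitor_origin=np.zeros(3),
                             monitor_normal=np.array([0.0, 0.0, 1.0]))
print(gaze)                                  # lands on the z = 0 monitor plane
```

With the face plane parallel to the monitor, the gaze point is simply the feature centroid projected onto the screen; head rotation tilts the normal and shifts the intersection accordingly.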


Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • Hwang, Suen-Ki;Kim, Moon-Hwan;Cha, Sam;Cho, Eun-Seuk;Bae, Cheol-Soo
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.2 no.3
    • /
    • pp.61-69
    • /
    • 2009
  • This paper proposes a new approach to real-time eye tracking. Existing methods produce poor results when the user moves the head even slightly, and require a calibration process for each user. The proposed method, based on infrared illumination and Generalized Regression Neural Networks (GRNN), enables reliable and accurate eye tracking even under large head movement; because the gaze mapping function generalizes, the per-user calibration process can be omitted, and eye tracking was possible even for users who did not participate in training. Experimental results showed an average tracking rate of 90% under facial movement and an average of 85% for other users.


Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • 이영식;배철수
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.8 no.2
    • /
    • pp.477-483
    • /
    • 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real-time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.
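Several entries in this listing map pupil parameters to screen coordinates with a Generalized Regression Neural Network (GRNN), which is a Gaussian-kernel-weighted average of the calibration targets (Specht's formulation). A minimal sketch with made-up calibration data — the bandwidth and the sample points are assumptions, not values from any of the papers:

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.5):
    """GRNN regression: kernel-weighted average of training targets."""
    d2 = np.sum((X_train - x) ** 2, axis=1)       # squared distances to training inputs
    w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian kernel weights
    return (w @ Y_train) / np.sum(w)              # weighted mean of target vectors

# Hypothetical calibration: pupil-parameter vectors -> screen coordinates (pixels)
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Y = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])

print(grnn_predict(X, Y, np.array([0.5, 0.5])))   # prints [50. 50.]
```

Because the output is just a smoothed interpolation of calibration samples, the mapping need not be an analytical function, which is why extra inputs such as head-pose parameters can be folded into `X` directly.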

Kiosk System Development Using Eye Tracking And Face-Recognition Technology (시선추적 기술과 얼굴인식 기술을 이용한 무인단말기(키오스크)시스템)

  • Kim, Min-Jae;Kim, Tae-Won;Lee, Hyo-Jin;Jo, Il-Hyun;Kim, Woongsup
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2020.11a
    • /
    • pp.486-489
    • /
    • 2020
  • This design recognizes the face and eyes and then, through gaze tracking, links eye movement to mouse movement to order menu items. With gaze tracking, menus can be ordered conveniently without touching the kiosk, and with face recognition, users can check their recent order history to order quickly and easily. A new user whose face is not yet registered can use an Android app to select a photo and menu items and add them to a cart, shortening ordering time; the system was implemented to provide this convenience to busy modern users.

Implementation of VLC Relay Module to Improve Communication Disconnection Phenomenon and Coverage Expansion in Non-Line-of-Sight Area (비가시선 영역의 통신 단절 현상 개선과 커버리지 확장을 위한 VLC 릴레이 모듈 구현)

  • Lee, Sang-gwon;Lee, Dae-hee;Oh, Chang-heon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.22 no.1
    • /
    • pp.140-146
    • /
    • 2018
  • Recently, VLC (Visible Light Communication), which fuses the LEDs (Light Emitting Diodes) used in indoor lighting with wireless communication technology, has attracted attention. However, VLC can communicate only within the coverage area where the optical signal can be measured, and a communication disconnection phenomenon occurs in the NLoS (Non-Line-of-Sight) area. In this paper, we propose a VLC relay module to extend the coverage of VLC and mitigate the communication disconnection phenomenon in the NLoS area. The proposed VLC relay module retransmits the packets received from the transmitter to other VLC relay modules and the receiver. The experiment was carried out with one VLC transmitter and three VLC relay modules, and extended communication coverage and improved behavior in the NLoS area were confirmed as the number of relay modules increased. We also confirmed that optical-signal measurement performance improved 2.4-fold using a dual sampling method.
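The relay behavior described above — each module retransmitting packets it receives so that nodes outside the transmitter's line of sight are still reached — can be sketched as a duplicate-suppressing rebroadcast. The class and the four-node topology below are illustrative assumptions, not the paper's implementation:

```python
class VLCRelay:
    """A node that rebroadcasts packets it has not seen before, so that
    receivers in non-line-of-sight (NLoS) areas are still reached."""

    def __init__(self, name):
        self.name = name
        self.neighbors = []    # nodes within this node's optical coverage
        self.seen = set()      # packet ids already handled (suppresses loops)
        self.delivered = []    # payloads received at this node

    def receive(self, packet_id, payload):
        if packet_id in self.seen:
            return             # duplicate: drop it, no rebroadcast storm
        self.seen.add(packet_id)
        self.delivered.append(payload)
        for node in self.neighbors:
            node.receive(packet_id, payload)   # retransmit to everyone in range

# Hypothetical chain: transmitter -> r1 -> r2 -> rx, where rx is NLoS
# from the transmitter and only r1 sees the original optical signal.
rx, r1, r2 = VLCRelay("rx"), VLCRelay("r1"), VLCRelay("r2")
r1.neighbors = [r2]
r2.neighbors = [rx, r1]        # r1 is also in r2's range: the echo is dropped
r1.receive(1, "hello")         # packet from the transmitter enters at r1
print(rx.delivered)            # prints ['hello']
```

Each added relay extends the reachable set by one optical hop, which mirrors the paper's observation that coverage grows with the number of relay modules.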

Real Time Gaze Discrimination for Human Computer Interaction (휴먼 컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Park, Ho-Sik;Bae, Cheol-Soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.3C
    • /
    • pp.125-132
    • /
    • 2005
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.

Real Time Gaze Discrimination for Computer Interface (컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Hwang, Suen-Ki;Kim, Moon-Hwan
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.3 no.1
    • /
    • pp.38-46
    • /
    • 2010
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.


A Study on Real Time Gaze Discrimination System using GRNN (GRNN을 이용한 실시간 시선 식별 시스템에 관한 연구)

  • Lee, Young-Sik;Bae, Cheol-Soo
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.9 no.2
    • /
    • pp.322-329
    • /
    • 2005
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.

Gaze Detection System by IR-LED based Camera (적외선 조명 카메라를 이용한 시선 위치 추적 시스템)

  • 박강령
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.29 no.4C
    • /
    • pp.494-504
    • /
    • 2004
  • Research on gaze detection has advanced considerably, with many applications. Most previous work relies only on image-processing algorithms, so it takes much processing time and has many constraints. In our work, we implement gaze detection with a computer vision system built around an IR-LED based single camera. To detect the gaze position, we locate facial features, which is performed effectively with the IR-LED based camera and an SVM (Support Vector Machine). When a user gazes at a position on the monitor, we compute the 3-D positions of those features based on 3-D rotation and translation estimation and an affine transform. Finally, the gaze position determined by the facial movement is computed from the normal vector of the plane defined by the computed 3-D feature positions. In addition, we use a trained neural network to detect the gaze position from eye movement. Experimental results show that we can obtain the facial and eye gaze position on a monitor, with an accuracy of about 4.2 cm RMS error between the computed positions and the real ones.