• Title/Abstract/Keyword: Robot Eyes

Search results: 41

비전 방식을 이용한 감정인식 로봇 개발 (Development of an Emotion Recognition Robot using a Vision Method)

  • 신영근;박상성;김정년;서광규;장동식
    • 산업공학 / Vol. 19, No. 3 / pp.174-180 / 2006
  • This paper deals with a robot system that recognizes a human's expression from a detected face and then displays that emotion. The face detection method is as follows. First, the RGB color space is converted to the CIELab color space. Second, a skin candidate region is extracted. Third, a face is detected through the geometric interrelation of facial features using a face filter. The positions of the eyes, nose, and mouth are then used as the preliminary data for expression recognition; in this paper, the changes of the eyebrows, eyes, and mouth are sent to a robot through serial communication. The robot then operates its installed motors and displays the human's expression. Experimental results on 10 persons show 78.15% accuracy.
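
The detection pipeline described in the abstract (RGB to CIELab conversion, skin-candidate extraction, then a geometric face filter) can be illustrated with a minimal sketch assuming OpenCV; the chroma thresholds below are placeholders rather than the paper's actual values.

```python
import cv2
import numpy as np

def skin_candidate_mask(bgr_image):
    """Convert to CIELab and keep pixels whose a/b chroma falls in a rough skin range.
    The bounds below are placeholders, not the thresholds used in the paper."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2Lab)   # OpenCV stores L, a, b in 0..255
    _, a, b = cv2.split(lab)
    mask = ((a > 135) & (a < 175) & (b > 130) & (b < 180)).astype(np.uint8) * 255
    # Clean up speckle before a geometric face filter inspects the candidate regions.
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# A geometric "face filter" would then check candidate regions for a plausible
# eye/nose/mouth arrangement before accepting a region as a face.
```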

한 쌍의 푸쉬-풀 와이어를 이용한 로봇 안구의 팬-틸트 모션 생성 (Pan-tilt Motion Generation of Robot Eye by Using a Pair of Push-pull Wires)

  • 정찬열;오경균;박신석;김승종
    • 한국소음진동공학회논문집 / Vol. 21, No. 1 / pp.3-8 / 2011
  • This paper introduces a robot eye module whose two degree-of-freedom motions, i.e. panning and tilting, are driven by a pair of wires. The main feature of the module is that each wire can generate push-pull motion without buckling, thanks to a Teflon tube that guides the path of the moving wire. The end points of the tube and wire have pivot elements so that a smooth push-pull motion is produced even when the end point of the wire is moved by eye rotation. This mechanism keeps the eye module very compact. In this paper, the structure of the robot eye module is introduced in detail, and the motor angles required for a given gaze direction are investigated analytically and experimentally.
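
The abstract does not reproduce the analytical motor-angle derivation; as a rough illustration only, under the simplifying assumption that wire travel is proportional to eyeball rotation through an effective moment arm and that each wire winds on a motor pulley, the mapping could look like the sketch below.

```python
import numpy as np

def wire_displacements(pan_rad, tilt_rad, arm_mm=6.0):
    """Toy model: each wire's push/pull travel ~ effective moment arm x rotation angle.
    This is an illustrative simplification, not the paper's analytical result."""
    return arm_mm * pan_rad, arm_mm * tilt_rad

def motor_angles(pan_rad, tilt_rad, pulley_radius_mm=5.0):
    """Convert wire travel to motor rotation (radians), assuming the wire winds on a pulley."""
    d_pan, d_tilt = wire_displacements(pan_rad, tilt_rad)
    return d_pan / pulley_radius_mm, d_tilt / pulley_radius_mm

print(motor_angles(np.deg2rad(20.0), np.deg2rad(10.0)))
```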

Image processing of artificial life-robot

  • Kubik, Tomasz;Loukianov, Andrey A.
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2001년도 ICCAS / pp.36.2-36 / 2001
  • At present, information processing by computer plays a large role in our society, and computer-controlled robots are widely introduced into factory production lines and similar settings, where improved robot abilities yield good results. Recently, robots have begun to take part not only in limited places such as factories but also in ordinary households: some robots entertain people, while others help with human tasks. Robots are also sure to be very useful in nursing care, which has become a pressing issue in our society. In this situation, robots are required to have vision like human eyes ...


로봇디자인에 대한 선호 반응에 영향을 미치는 조형요소의 특성 (Characteristics of Formative Factor Influencing Robot Design's Preference Response)

  • 허성철;정정필
    • 감성과학 / Vol. 11, No. 4 / pp.511-520 / 2008
  • The basic purpose of this study is to analyze the characteristics of the combinations of formative elements composing a robot's face, based on the results of preference responses to robot designs. In addition, the study examines the possibility of deriving a design guide for improving preference from the analysis results. To this end, 27 photographs of robot faces were selected as experimental stimuli, and experiments on preference responses and association responses were conducted. The experiments revealed various characteristics, for example that the shape of the eyes influences preference responses more than the shape of the face. Based on these results, the characteristics of each formative element that can positively influence preference responses to a robot face were derived, and basic design guidelines were proposed. Specifically, first, the eyes should be elliptical, with a vertical length longer than the horizontal length (167%). The distance between the eyes should be kept at about 35% of the face width. It is also important to place the eyes above the horizontal center axis of the face to give a sense of visual stability. The overall head shape is preferably a sphere type based on a circle. Finally, the harmony between the head shape and the eyes should realize the cute and charming image that a robot should basically have.
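
The numeric guidelines above (eye ellipse with a 167% vertical-to-horizontal ratio, eye spacing of about 35% of the face width, eyes above the horizontal center axis, spherical head) can be collected into a small helper; the function name and the absolute sizes passed in are illustrative, and only the ratios come from the abstract.

```python
def robot_face_proportions(face_width_mm, eye_width_mm):
    """Apply the preference-based ratios from the study above.
    face_width_mm and eye_width_mm are free design inputs; only the 167% eye
    aspect ratio and the 35% eye spacing come from the abstract."""
    return {
        "eye_height_mm": 1.67 * eye_width_mm,      # vertical axis ~167% of horizontal
        "eye_spacing_mm": 0.35 * face_width_mm,    # distance between the eyes
        "eye_vertical_position": "above the horizontal center axis of the face",
        "head_shape": "sphere (circle-based)",
    }

print(robot_face_proportions(face_width_mm=120, eye_width_mm=18))
```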


외부조명 변화에 강인한 운전자 졸음 감지 시스템 (System for Detecting Driver's Drowsiness Robust to Variations of External Illumination)

  • 최원웅;반성범;신주현
    • 한국멀티미디어학회논문지 / Vol. 19, No. 6 / pp.1024-1033 / 2016
  • In this study, a system is proposed that analyzes whether a driver's eyes are open or closed on the basis of images in order to determine the driver's drowsiness. The proposed system converts the eye areas detected by a camera to a color space that allows eyes to be detected effectively both in dark situations, for example in tunnels, and in bright situations caused by backlight. In addition, the system uses the thickness distribution of the detected eye area as a feature value and classifies whether the eyes are open or closed with a Support Vector Machine (SVM), achieving 90.09% accuracy. In an experiment with images of drivers wearing glasses, 83.83% accuracy was obtained. A comparative experiment with an existing PCA method using Eigen-eye and Pupil Measuring System also showed an improved detection rate. Finally, the driver's drowsiness was identified accurately by accumulating the open/closed state of the driver's eyes over time and by detecting eyes that remain closed continuously.
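
A minimal sketch of the classification stage, assuming scikit-learn: the per-column pixel count stands in for the paper's thickness-distribution feature, and the consecutive-closed-frame threshold is an assumed value rather than the one used in the study.

```python
import numpy as np
from sklearn.svm import SVC

def thickness_profile(eye_mask):
    """Per-column count of eye pixels in a binary eye-region mask, a stand-in
    for the paper's 'thickness distribution' feature."""
    return eye_mask.astype(np.float32).sum(axis=0)

def train_eye_state_classifier(eye_masks, labels):
    """labels: 1 = open, 0 = closed."""
    X = np.vstack([thickness_profile(m) for m in eye_masks])
    return SVC(kernel="rbf").fit(X, labels)

def is_drowsy(clf, eye_mask_sequence, closed_frames_threshold=15):
    """Flag drowsiness when the eyes stay classified as closed for several consecutive frames."""
    consecutive_closed = 0
    for mask in eye_mask_sequence:
        state = clf.predict(thickness_profile(mask)[None, :])[0]
        consecutive_closed = consecutive_closed + 1 if state == 0 else 0
        if consecutive_closed >= closed_frames_threshold:
            return True
    return False
```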

히스토그램을 이용한 얼굴 표정 인식 방법 (A Face Expression Recognition Method using Histograms)

  • 허경무
    • 제어로봇시스템학회논문지 / Vol. 20, No. 5 / pp.520-525 / 2014
  • Generally, feature-area detection methods, which detect the feature areas of the human eyes, eyebrows, and mouth, are widely used for face expression recognition. In this paper, we propose a face expression recognition method using histograms of the face, eyes, and mouth for many applications including robot technology. The experimental results show that the proposed method provides a new type of face expression recognition capability compared to conventional methods.
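
The abstract leaves the histogram comparison unspecified; one plausible reading, sketched below with OpenCV, matches normalized grayscale histograms of the face, eye, and mouth regions against per-expression templates by correlation. Region extraction and template construction are assumed to happen elsewhere.

```python
import cv2
import numpy as np

def region_histogram(gray_region, bins=64):
    """Normalized grayscale histogram of one region (face, eyes, or mouth)."""
    hist = cv2.calcHist([gray_region], [0], None, [bins], [0, 256])
    return cv2.normalize(hist, hist).flatten()

def classify_expression(regions, templates):
    """regions: dict name -> grayscale image; templates: dict label -> dict name -> histogram.
    Picks the expression whose template histograms correlate best with the input regions."""
    scores = {}
    for label, tmpl in templates.items():
        scores[label] = np.mean([
            cv2.compareHist(region_histogram(img), tmpl[name], cv2.HISTCMP_CORREL)
            for name, img in regions.items()
        ])
    return max(scores, key=scores.get)
```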

인공근육을 이용한 얼굴로봇 (A Face Robot Actuated with Artificial Muscle)

  • 곽종원;지호준;정광목;남재도;전재욱;최혁렬
    • 제어로봇시스템학회논문지 / Vol. 10, No. 11 / pp.991-999 / 2004
  • Face robots capable of expressing their emotional status can be adopted as an efficient tool for friendly communication between the human and the machine. In this paper, we present a face robot actuated with artificial muscle based on a dielectric elastomer. By exploiting the properties of polymers, it is possible to actuate the covering skin and eyes as well as provide human-like expressivity without employing complicated mechanisms. The robot is driven by seven types of actuator modules, namely eye, eyebrow, eyelid, brow, cheek, jaw and neck modules, corresponding to movements of the facial muscles. Although these cover only part of the whole set of facial motions, our approach is sufficient to generate six fundamental facial expressions: surprise, fear, anger, disgust, sadness, and happiness. Each module communicates with the others via the CAN communication protocol. For the desired emotional expressions, the facial motions are generated by combining the motions of each actuator module. A prototype of the robot has been developed and several experiments have been conducted to validate its feasibility.
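
A schematic of how per-module motions might be combined into an expression is sketched below; the activation levels and the command stub are illustrative assumptions, since the paper's actual CAN message format and actuation values are not given in the abstract.

```python
# Illustrative mapping from a target expression to per-module actuation levels (0..1);
# the actual values and message format used in the paper are not reproduced here.
EXPRESSIONS = {
    "surprise":  {"eyebrow": 1.0, "eyelid": 1.0, "jaw": 0.8},
    "happiness": {"cheek": 0.9, "eyelid": 0.3, "jaw": 0.4},
    "sadness":   {"eyebrow": 0.2, "eyelid": 0.6, "neck": 0.1},
}

def send_module_command(module_name, level):
    """Stub: a real system would encode this as a CAN frame addressed to the module."""
    print(f"{module_name}: drive to {level:.2f}")

def show_expression(name):
    """Generate one facial expression by combining the motions of the actuator modules."""
    for module, level in EXPRESSIONS.get(name, {}).items():
        send_module_command(module, level)

show_expression("surprise")
```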

양안 시공간을 이용한 Linear Visual Feedback Control (Linear Visual Feedback Control using Binocular Visual Space)

  • 임승우;박창균
    • 한국음향학회지 / Vol. 14, No. 6 / pp.74-79 / 1995
  • In this paper, a binocular LVFC robot system that imitates the structure of the human eyes and arm is constructed. A linear approximation between the visual space and the joint space is derived by the least-squares method and applied to binocular LVFC robot position control, and its validity is confirmed through simulation. Compared with the conventional binocular VFC robot that uses feature points, the proposed binocular LVFC robot omits the computation of the image Jacobian and the robot Jacobian, so real-time control is possible.
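
The core idea, fitting a linear (affine) map from binocular image coordinates to joint angles by least squares and using it for feedback, can be sketched with NumPy; the coordinate layout, the bias term, and the proportional feedback gain are assumptions rather than the paper's exact formulation.

```python
import numpy as np

def fit_visual_to_joint_map(image_coords, joint_angles):
    """Least-squares fit of an affine map q ~ A s + b from binocular image
    coordinates s (e.g. [uL, vL, uR, vR]) to joint angles q."""
    S = np.hstack([image_coords, np.ones((len(image_coords), 1))])  # add bias column
    M, *_ = np.linalg.lstsq(S, joint_angles, rcond=None)
    return M  # shape: (image_dim + 1, joint_dim)

def visual_feedback_step(M, current_coords, target_coords, gain=0.5):
    """One proportional visual-feedback update in joint space using the fitted map."""
    s_err = np.append(target_coords, 1.0) - np.append(current_coords, 1.0)
    return gain * (s_err @ M)  # joint-angle increment

# Usage with synthetic calibration data:
rng = np.random.default_rng(0)
s = rng.uniform(-1, 1, size=(50, 4))           # binocular image coordinates
q = s @ rng.normal(size=(4, 3)) + 0.1          # pretend joint angles (3 joints)
M = fit_visual_to_joint_map(s, q)
print(visual_feedback_step(M, s[0], s[1]))
```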


로봇디자인에 대한 선호 반응에 영향을 미치는 조형요소의 특성 (The Property of Formative Factor Influencing Preference on Robot's Design)

  • 정정필;허성철
    • 한국감성과학회:학술대회논문집 / 한국감성과학회 2008년도 추계학술대회 / pp.38-41 / 2008
  • This study's basic intention is to analyze the properties of the combinations of formative elements composing a robot's face, based on the results of preference responses to robot designs. In addition, in order to improve preference, the study inquires into the possibility of suggesting a design guideline from the analysis results. To this end, photographs of 27 robot faces were selected as experimental stimuli, and experiments on preference responses and association responses were performed. As a result, various properties were found, such as the shape of the robots' eyes having a greater influence than the facial structure. Based on these results, the properties of each formative element that could have a positive influence on preference responses to a robot's face could be drawn, and a basic design guideline could also be suggested.


2D 얼굴 영상을 이용한 로봇의 감정인식 및 표현시스템 (Emotion Recognition and Expression System of Robot Based on 2D Facial Image)

  • 이동훈;심귀보
    • 제어로봇시스템학회논문지 / Vol. 13, No. 4 / pp.371-376 / 2007
  • This paper presents an emotion recognition and expression system for an intelligent robot such as a home or service robot. The robot recognizes emotion from a facial image, using the motion and position of many facial features. A tracking algorithm is applied to recognize a moving user from the mobile robot, and the skin color of the hands and the background outside the facial region are eliminated by a facial-region detection algorithm when capturing the user image. After normalization operations, that is, enlarging or reducing the image according to the distance of the detected facial region and rotating the image according to the angle of the face, the mobile robot can obtain a facial image of a fixed size. A multi-feature selection algorithm is implemented to enable the robot to recognize the user's emotion. In this paper, a multilayer perceptron Artificial Neural Network (ANN) is used for pattern recognition, with the Back Propagation (BP) algorithm as the learning algorithm. The emotion of the user recognized by the robot is expressed on a graphic LCD: the emotion output of the ANN is mapped to two coordinates, and the parameters of the facial elements (eyes, eyebrows, mouth) are changed according to those coordinates. By implementing the system, complex human emotions are expressed by an avatar on the LCD.
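
A minimal sketch of the recognition stage, assuming scikit-learn: the label set, layer sizes, and the emotion-to-avatar parameter table are illustrative stand-ins, since the paper's feature vectors, network size, and two-coordinate mapping are not specified in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["neutral", "happiness", "sadness", "surprise", "anger"]  # illustrative label set

def train_emotion_mlp(feature_vectors, labels):
    """feature_vectors: motion/position measurements of facial features per image.
    MLPClassifier trains a multilayer perceptron via backpropagated gradients."""
    return MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000).fit(feature_vectors, labels)

def emotion_to_avatar_params(emotion):
    """Toy mapping from a recognized emotion to parameters of the avatar's
    eyes, eyebrows, and mouth; the paper's two-coordinate mapping is not reproduced."""
    table = {
        "happiness": {"eyebrows": 0.2, "eyes": 0.7, "mouth": 0.9},
        "sadness":   {"eyebrows": -0.6, "eyes": 0.4, "mouth": -0.5},
        "surprise":  {"eyebrows": 0.9, "eyes": 1.0, "mouth": 0.6},
    }
    return table.get(emotion, {"eyebrows": 0.0, "eyes": 0.5, "mouth": 0.0})

# Usage with synthetic data:
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 12))                       # 12 facial-feature measurements
y = rng.choice(EMOTIONS, size=100)
clf = train_emotion_mlp(X, y)
print(emotion_to_avatar_params(clf.predict(X[:1])[0]))
```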