• Title/Summary/Keyword: Camera mouse

A Study on the Implementation of a Physical Interface (피지컬 인터페이스의 구현에 관한 연구)

  • 오병근
    • Archives of Design Research, v.16 no.2, pp.131-140, 2003
  • The input for computer interaction design is very limited when users can control the interface only with a keyboard and mouse. However, using basic electrical engineering, the input design can differ from the existing method. Interactive art using computer technology has recently emerged, and it is completed by people's participation. The electric signal, transmitted in digital or analogue form from an interface controlled by people to the computer, can be used in multimedia interaction design. An electric circuit design is necessary to transmit a stable electric signal from the interface. Electric switches, sensors, and camera technologies can be applied to input interface design, providing an alternative physical interaction without a computer keyboard and mouse. This type of interaction design, which uses human body language and gesture, would convey the richness of humanity.

A study on the osteoblast differentiation using osteocalcin gene promoter controlling luciferase expression (리포터유전자를 이용한 조골세포 분화정도에 관한 연구)

  • Kim, Kyoung-Hwa;Park, Yoon-Jeong;Lee, Yong-Moo;Han, Jung-Suk;Lee, Dong-Soo;Lee, Seung-Jin;Chung, Chong-Pyoung;Seol, Yang-Jo
    • Journal of Periodontal and Implant Science, v.36 no.4, pp.839-847, 2006
  • The aim of this study is to monitor reporter gene expression under the osteocalcin gene promoter, using a real-time molecular imaging system, as a tool to investigate osteoblast differentiation. The promoter region of mouse osteocalcin gene 2 (mOG2), the best-characterized osteoblast-specific gene, was inserted into a promoterless luciferase reporter vector. Expression of the reporter gene was confirmed, and the relationship between reporter gene expression and osteoblastic differentiation was evaluated. Gene expression according to osteoblastic differentiation on biomaterials was monitored using a real-time molecular imaging system. Luciferase was expressed only in cells transduced with pGL4/mOGP, and the level of expression was statistically higher in cells cultured in mineralization medium than in cells in growth medium. A CCCD camera detected the luciferase expression, and the differentiation-dependent intensity of luminescence was visible. The cells produced osteocalcin with a time-dependent increment in BMP-2 treated cells, and there was a difference between BMP-2 treated and untreated cells at 14 days. There was a difference in the level of luciferase expression under pGL4/mOGP between BMP-2 treated and untreated cells at 3 days. The CCCD camera detected luciferase expression in cells transduced with pGL4/mOGP on a Ti disc, and the differentiation-dependent intensity of luminescence was visible. This study shows that 1) expression of luciferase is regulated by the mouse OC promoter, 2) the CCCD detection system is a reliable quantitative gene detection tool for osteoblast differentiation, and 3) the dynamics of mouse OC promoter regulation during osteoblast differentiation can be followed in real time and quantitatively on a biomaterial. The present system is a very reliable system for monitoring osteoblast differentiation in real time and may be used for monitoring the effects of growth factors, drugs, cytokines and biomaterials on osteoblast differentiation in animals.

An alternative method for smartphone input using AR markers

  • Kang, Yuna;Han, Soonhung
    • Journal of Computational Design and Engineering, v.1 no.3, pp.153-160, 2014
  • As smartphones have recently come into wide use, they have become increasingly popular not only among young people but among middle-aged people as well. Most smartphones adopt a capacitive full touch screen, so touch commands are made with the fingers, unlike the PDAs of the past that used touch pens. In this case, a significant portion of the smartphone's screen is blocked by the finger, so it is impossible to see the area around the finger touching the screen; this causes difficulties in making precise inputs. To solve this problem, this research proposes a method of using simple AR markers to improve the smartphone interface. A marker is placed in front of the smartphone camera. The camera image of the marker is then analyzed, and the position of the marker is taken as the position of the mouse cursor. This method enables the click, double-click, and drag-and-drop used on PCs as well as the touch, slide, and long-touch inputs of smartphones. Through this research, smartphone inputs can be made more precise and simple, showing the possibility of a new concept of smartphone interface.
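
As a rough illustration of the marker-to-cursor idea in this abstract, the sketch below detects a fiducial marker in each camera frame and maps its image position to screen coordinates. OpenCV's ArUco module and the simple linear image-to-screen mapping are assumptions for illustration; the paper does not state which marker system or mapping it uses.

```python
# Minimal sketch: track a fiducial marker with OpenCV's ArUco module and map
# its image position to a cursor position (assumed linear image-to-screen map).
import cv2

SCREEN_W, SCREEN_H = 1080, 1920          # assumed target screen resolution
detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))

def marker_to_cursor(frame):
    """Return (x, y) cursor coordinates for the first detected marker, or None."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    center = corners[0][0].mean(axis=0)   # mean of the 4 corner points
    h, w = frame.shape[:2]
    return (int(center[0] / w * SCREEN_W), int(center[1] / h * SCREEN_H))

cap = cv2.VideoCapture(0)
while True:                               # runs until the capture fails
    ok, frame = cap.read()
    if not ok:
        break
    cursor = marker_to_cursor(frame)
    if cursor is not None:
        print("cursor at", cursor)        # a real system would move a pointer here
```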

Emotion Recognition by CCD Color Image (CCD 컬러영상에 의한 감성인식)

  • Lee, Sang-Yoon;Joo, Young-Hoon;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems, v.12 no.2, pp.97-102, 2002
  • In this paper, we propose a technique for recognizing human emotion using CCD color images. To do this, we first obtain the face image, using skin color, from the original color image acquired by the CCD camera. We then propose a method for finding facial feature points (eyebrows, eyes, nose, mouth) in the face image, and a geometrical method for recognizing human emotions (surprise, anger, happiness, sadness) from the structural correlation of those feature points. The proposed method recognizes human emotion by training a neural network. Finally, we have proven the effectiveness of the proposed method through experiments.
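
The skin-color face-detection step mentioned above can be sketched as follows; the YCrCb color space and the threshold values are assumptions for illustration, since the abstract does not give the paper's exact skin model.

```python
# Minimal sketch of skin-color face segmentation, assuming a YCrCb threshold
# model (the paper does not specify its exact color space or bounds).
import cv2
import numpy as np

def skin_mask(bgr_image):
    """Return a binary skin mask and the bounding box of the largest skin blob."""
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)     # assumed Cr/Cb skin bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask, None
    face = max(contours, key=cv2.contourArea)           # largest blob ~ the face
    return mask, cv2.boundingRect(face)
```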

Design and Implementation of PC-Mechanic Education Application System Using Image Processing (영상처리를 이용한 PC 내부구조 학습 어플리케이션 설계 및 구현)

  • Kim, Won-Jin;Kim, Hyung-Ook;Jo, Sung-Eun;Jang, Soo-Jeong;Moon, Il-Young
    • The Journal of Korean Institute for Practical Engineering Education, v.3 no.2, pp.93-99, 2011
  • We introduce an application that uses a multi-touch table for learning the internal structure of a PC and preparing for the PC-mechanic certification. These days, people increasingly interact with computers through gestures rather than only a mouse and keyboard, and multi-touch tables have become popular. We built the application with 3ds Max and C#, adding graphics and images to the learning material. Learners can scale and drag PC components through the camera view, and the application includes domestic PC-mechanic examination questions to help them obtain the certification.
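
The scale-and-drag interaction mentioned above can be illustrated generically: dragging follows a single touch, and scaling uses the ratio of finger spacing between frames. This is a minimal sketch of that idea, not the paper's C# implementation.

```python
# Generic multi-touch drag and pinch-to-scale computations (illustrative only).
import math

def drag_delta(prev_touch, cur_touch):
    """Translation of a component from a single touch moving between frames."""
    return (cur_touch[0] - prev_touch[0], cur_touch[1] - prev_touch[1])

def pinch_scale(prev_touches, cur_touches):
    """Scale factor from two touches: ratio of current to previous finger spacing."""
    d_prev = math.dist(prev_touches[0], prev_touches[1])
    d_cur = math.dist(cur_touches[0], cur_touches[1])
    return d_cur / d_prev if d_prev > 0 else 1.0
```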

Dynamic Manipulation of a Virtual Object in Marker-less AR system Based on Both Human Hands

  • Chun, Jun-Chul;Lee, Byung-Sung
    • KSII Transactions on Internet and Information Systems (TIIS), v.4 no.4, pp.618-632, 2010
  • This paper presents a novel approach to controlling augmented reality (AR) objects robustly in a marker-less AR system by fingertip tracking and hand pattern recognition. One of the promising ways to develop a marker-less AR system is to use the human body, such as the hand or face, in place of traditional fiducial markers. This paper introduces a real-time method to manipulate the overlaid virtual objects dynamically in a marker-less AR system using both hands with a single camera. The left bare hand is considered as a virtual marker in the marker-less AR system, and the right hand is used as a hand mouse. To build the marker-less system, we utilize a skin-color model for hand shape detection and curvature-based fingertip detection from the input video image. Using the detected fingertips, the camera pose is estimated to overlay virtual objects on the hand coordinate system. In order to manipulate the virtual objects rendered in the marker-less AR system dynamically, a vision-based hand control interface is developed, which exploits fingertip tracking for the movement of the objects and pattern matching for hand command initiation. The experiments show that the proposed system can control the objects dynamically in a convenient fashion.
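
A curvature-based fingertip detector of the kind described above can be sketched with a k-cosine measure over the hand contour; the neighborhood size k and the angle threshold below are assumed values, not the paper's parameters.

```python
# Minimal sketch of curvature-based fingertip detection on a hand contour
# using a k-cosine measure (k and the angle threshold are assumptions).
import numpy as np

def fingertip_candidates(hand_contour, k=16, max_angle_deg=60.0):
    """Return contour points whose k-cosine angle is sharp enough to be a fingertip."""
    pts = hand_contour.reshape(-1, 2).astype(np.float64)
    n = len(pts)
    tips = []
    for i in range(n):
        prev_v = pts[(i - k) % n] - pts[i]
        next_v = pts[(i + k) % n] - pts[i]
        denom = np.linalg.norm(prev_v) * np.linalg.norm(next_v)
        if denom == 0:
            continue
        cos_a = np.clip(np.dot(prev_v, next_v) / denom, -1.0, 1.0)
        if np.degrees(np.arccos(cos_a)) < max_angle_deg:   # sharp corner
            tips.append(tuple(pts[i].astype(int)))
    return tips

# Typical use: threshold the skin mask, take the largest contour as the hand,
# call fingertip_candidates(contour), then cluster nearby candidate points.
```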

Human-Computer Interface using the Eye-Gaze Direction (눈의 응시 방향을 이용한 인간-컴퓨터간의 인터페이스에 관한 연구)

  • Kim, Do-Hyoung;Kim, Jea-Hean;Chung, Myung-Jin
    • Journal of the Institute of Electronics Engineers of Korea CI, v.38 no.6, pp.46-56, 2001
  • In this paper we propose an efficient approach for real-time eye-gaze tracking from an image sequence and magnetic sensor information. The inputs to the eye-gaze tracking system are images taken by a camera and data from a magnetic sensor. The measured data are sufficient to describe the eye and head movement, because the camera and the receiver of the magnetic sensor are stationary with respect to the head. Experimental results show the validity of the proposed system for real-time application and also show its feasibility as a new computer interface replacing the mouse.
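
The fusion described above, an eye direction measured in the head frame combined with head orientation from the magnetic sensor, amounts to rotating the gaze vector into world coordinates. A minimal sketch follows, assuming a Z-Y-X Euler convention for the sensor output (the paper does not specify its convention).

```python
# Minimal sketch: rotate the camera-estimated eye direction (head frame) into
# the world frame using the head orientation from the magnetic sensor.
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Z-Y-X Euler angles (radians) to a 3x3 rotation matrix."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def gaze_in_world(eye_dir_head, head_yaw, head_pitch, head_roll):
    """Unit gaze vector in world coordinates."""
    v = np.asarray(eye_dir_head, dtype=float)
    return rotation_matrix(head_yaw, head_pitch, head_roll) @ (v / np.linalg.norm(v))
```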

User Identification System Based on Iris Information Using a Mouse (홍채 정보 기반 마우스를 활용한 사용자 인증 시스템)

  • Kim Sin-Hong;Rho Kwang-Hyun;Moon Soon-Hwan
    • The Journal of the Korea Contents Association, v.6 no.1, pp.143-150, 2006
  • Recently, fields such as internet banking and electronic commerce have been growing due to the widespread use of personal computers and the progress of communication technology, so the importance of information security has increased. Traditional identification systems are inherently insecure because personal identification information can be forgotten, stolen or lost. In this paper, we propose an identification system that can decide whether a user is registered, based on iris information acquired with a mouse. The proposed system mounts a CCD camera and an illumination device on a general-purpose mouse. It then decides whether the user is registered after the acquired images are processed and analyzed. This system gives a PC user the advantages of low cost and convenience, without the need to prepare expensive biometric equipment for an identification system.
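
The abstract does not describe how the processed iris images are compared; a common approach for this kind of verification is to match binary iris codes by normalized Hamming distance. The sketch below shows that generic scheme under an assumed acceptance threshold, not the paper's actual method.

```python
# Generic iris-code verification sketch: normalized Hamming distance against
# enrolled templates (threshold value is an assumed illustrative number).
import numpy as np

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Normalized Hamming distance between two equal-length binary iris codes."""
    assert code_a.shape == code_b.shape
    return float(np.count_nonzero(code_a != code_b)) / code_a.size

def is_registered_user(probe_code, enrolled_codes, threshold=0.32):
    """Accept the user if any enrolled code is close enough to the probe."""
    return any(hamming_distance(probe_code, c) <= threshold for c in enrolled_codes)
```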

The effects of the usability of products on user's emotions - with emphasis on suggestion of methods for measuring user's emotions expressed while using a product -

  • Jeong, Sang-Hoon
    • Archives of Design Research, v.20 no.2 s.70, pp.5-16, 2007
  • The main objective of our research is to analyze the user's emotional changes while using a product, in order to reveal the influence of usability on human emotions. In this study we extracted, through three methods, emotional words that can come up during user interaction with a product and reveal emotional changes. In the end we extracted 88 emotional words for measuring the emotions users express while using products, and categorized the 88 words into 6 groups using factor analysis. The 6 categories extracted in this study were found to be the user's representative emotions expressed while using products. It is expected that the emotional words and representative emotions extracted here will be used as the subjective evaluation data required to measure a user's emotional changes while using a product. We also proposed effective methods for measuring the emotions a user expresses while using a product, in an environment that is natural and accessible for the field of design, by using the emotion mouse and the Eyegaze. An examinee performs several tasks with the emotion mouse through a mobile phone simulator on a computer monitor connected to the Eyegaze. During testing, the emotion mouse senses the user's EDA and PPG and transmits the data to the computer, the Eyegaze observes the change in pupil size, and a video camera records the user's facial expression. After each test, a subjective evaluation of the emotional changes is performed by the users themselves using the emotional words extracted in the study above. We aim to evaluate the satisfaction level with the usability of the product and compare it with the actual experiment results. Through continuous studies based on this research, we hope to supply a basic framework for the development of interfaces that take the user's emotions into consideration.
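
Logging the EDA and PPG signals the emotion mouse streams to the PC could look roughly like the sketch below; the serial port, baud rate, and CSV line format are hypothetical, since the paper does not describe its transmission protocol.

```python
# Hypothetical logger for an emotion mouse streaming "EDA,PPG" lines over a
# serial link; port name, baud rate, and line format are assumptions.
import csv
import time

import serial  # pyserial

def log_emotion_mouse(port="COM3", baud=9600, out_path="session.csv", seconds=60):
    with serial.Serial(port, baud, timeout=1) as dev, \
            open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "eda", "ppg"])
        end = time.time() + seconds
        while time.time() < end:
            line = dev.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                eda, ppg = (float(x) for x in line.split(","))
            except ValueError:
                continue                      # skip malformed samples
            writer.writerow([time.time(), eda, ppg])
```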

A study on 3D Pottery Modeling based on Web (웹기반 3D 도자기 모델링에 관한 연구)

  • Park, Gyoung Bae
    • Journal of the Korea Society of Computer and Information, v.17 no.12, pp.209-217, 2012
  • In this paper, I propose a new system in which a user models symmetric 3D pottery with a mouse and can confirm the result immediately in an internet browser. The main advantage of the proposed system is that users who have no specialized knowledge of 3D graphics can easily create 3D objects. A user needs only a network-connected PC and a mouse, without additional devices such as expensive haptic devices or cameras. The proposed system was developed with VRML/X3D, the international standard language for virtual reality and 3D graphics. Unlike other 3D graphics languages, it was designed for the internet, so it supports user interaction and navigation. Considering these features and the high completeness of the 3D pottery created with a mouse, the system is useful and superior in performance to other pottery modeling systems.
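
The symmetric-pottery modeling described above is essentially a surface of revolution: a 2D profile sketched with the mouse is revolved around the vertical axis. A minimal sketch of that mesh construction follows, with an illustrative profile; the paper's actual VRML/X3D implementation differs.

```python
# Minimal sketch: revolve a 2D profile (radius as a function of height) around
# the vertical axis to build a lathe mesh for a symmetric pot.
import numpy as np

def lathe_mesh(profile, segments=32):
    """profile: list of (radius, height) pairs; returns (vertices, quad faces)."""
    profile = np.asarray(profile, dtype=float)
    angles = np.linspace(0.0, 2.0 * np.pi, segments, endpoint=False)
    verts = [(r * np.cos(a), h, r * np.sin(a)) for r, h in profile for a in angles]
    faces = []
    for i in range(len(profile) - 1):
        for j in range(segments):
            a = i * segments + j
            b = i * segments + (j + 1) % segments
            faces.append((a, b, b + segments, a + segments))
    return np.array(verts), faces

# Example: a simple pot-like profile from base to rim.
vertices, quads = lathe_mesh([(0.2, 0.0), (0.8, 0.3), (1.0, 0.8), (0.6, 1.2), (0.7, 1.4)])
```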