• Title/Summary/Keyword: camera mouse

Search Results: 66

Game-type Recognition Rehabilitation System based on Augmented Reality through Object Understanding (증강현실 기반의 물체 인식을 통한 게임형 인지 재활 시스템)

  • Lim, Myung-Jea;Jung, Hee-Woong;Lee, Ki-Young
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.11 no.3
    • /
    • pp.93-98
    • /
    • 2011
  • In this paper, we propose a game-type cognitive rehabilitation system using marker-based augmented reality for the cognitive development of the user. Existing cognitive rehabilitation requires the help of others or keyboard and mouse operation; to relieve this inconvenience, the proposed method advances an approach in which the system is controlled with marker cards alone. Camera calibration is applied for the image processing, and marker detection provides the augmented-reality overlay. The user checks the whole form of a model, rotates it to completion, and combines multiple markers to complete interactive objects, carrying out the rehabilitation process in the required manner so that rehabilitation and treatment remain of interest to the subject.
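
The processing chain in the abstract is camera calibration followed by marker detection for the augmented-reality overlay, but the paper itself gives no code. Purely as a hypothetical sketch of the calibration step alone (the chessboard pattern, file names, and the use of OpenCV are assumptions, not the authors' setup), it could look like this in Python:

```python
# Hypothetical camera-calibration sketch (not the authors' code).
# Assumes chessboard photos named calib_*.png with a 9x6 inner-corner pattern.
import glob
import numpy as np
import cv2

PATTERN = (9, 6)  # inner corners per row / column of the printed chessboard

# 3-D reference points of the chessboard corners (all on the z = 0 plane)
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None

for path in glob.glob("calib_*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

if obj_points:
    # Intrinsics and distortion coefficients, later reused for marker pose estimation
    rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    print("RMS reprojection error:", rms)
    print("camera matrix:\n", camera_matrix)
```

The resulting camera matrix and distortion coefficients are what a marker detector would use to place virtual objects on the detected marker cards.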

The Development of Interactive Tiled Display Applications Using the iTILE Framework (iTILE 프레임워크를 이용한 인터랙티브 타일드 디스플레이 응용프로그램 개발)

  • Kim, Seok-Hwan;Kim, Min-Young;Kim, Su-Hwa;Kim, Ji-Hyoun;Min, Chul-Kee;Cho, Yong-Joo;Park, Kyoung-Shin
    • Proceedings of the HCI Society of Korea Conference (한국HCI학회 학술대회논문집)
    • /
    • 2009.02a
    • /
    • pp.487-492
    • /
    • 2009
  • Large displays will soon be common in public places, and we expect these displays to provide users with many convenient interactive services. However, due to their physical size, a typical mouse and keyboard are not suitable for user interaction on large displays. Recently, many studies have investigated new interaction techniques and input devices for user interaction with large displays. In this paper, we describe the iTILE framework, designed to help developers build interactive tiled display applications. We then describe two iTILE applications that use a tape switch interface and a tangible interface with an infrared camera and markers.


HAND GESTURE INTERFACE FOR WEARABLE PC

  • Nishihara, Isao;Nakano, Shizuo
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2009.01a
    • /
    • pp.664-667
    • /
    • 2009
  • There is strong demand to create wearable PC systems that can support the user outdoors. When we are outdoors, our movement makes it impossible to use traditional input devices such as keyboards and mice. We propose a hand gesture interface based on image processing to operate wearable PCs. The semi-transparent PC screen is displayed on the head-mounted display (HMD), and the user makes hand gestures to select icons on the screen. The user's hand is extracted from the images captured by a color camera mounted above the HMD. Since skin color can vary widely due to outdoor lighting effects, a key problem is accurately discriminating the hand from the background. The proposed method does not assume any fixed skin color space. First, the image is divided into blocks, and blocks with similar average color are linked. Contiguous regions are then subjected to hand recognition. Blocks on the edges of the hand region are subdivided for more accurate finger discrimination. A change in hand shape is recognized as hand movement. Our current input interface associates a hand grasp with a mouse click. Tests on a prototype system confirm that the proposed method recognizes hand gestures accurately at high speed. We intend to develop a wider range of recognizable gestures.
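
The block-linking procedure described in the abstract (divide the frame into blocks, average each block's color, link neighbouring blocks with similar averages, then look for the hand among the contiguous regions) can be sketched roughly as follows. This is a hypothetical illustration rather than the authors' implementation: the block size and similarity threshold are assumed values, and the edge-block subdivision and grasp recognition are omitted.

```python
# Hypothetical sketch of block-based region linking without a fixed skin-color model.
# Block size and similarity threshold are assumptions, not values from the paper.
import numpy as np
import cv2

def block_regions(frame_bgr, block=16, thresh=20.0):
    h, w = frame_bgr.shape[:2]
    gh, gw = h // block, w // block
    # Average color of each block
    means = frame_bgr[:gh * block, :gw * block].reshape(
        gh, block, gw, block, 3).mean(axis=(1, 3))

    labels = -np.ones((gh, gw), dtype=int)
    region = 0
    for y in range(gh):
        for x in range(gw):
            if labels[y, x] != -1:
                continue
            stack = [(y, x)]           # flood-fill over the block grid
            labels[y, x] = region
            while stack:
                cy, cx = stack.pop()
                for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                    if 0 <= ny < gh and 0 <= nx < gw and labels[ny, nx] == -1 \
                            and np.linalg.norm(means[ny, nx] - means[cy, cx]) < thresh:
                        labels[ny, nx] = region
                        stack.append((ny, nx))
            region += 1
    return labels  # contiguous block regions; the hand is searched for among these

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # camera mounted above the HMD in the paper
    ok, frame = cap.read()
    if ok:
        print("linked block regions:", block_regions(frame).max() + 1)
    cap.release()
```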


Evaluation of the Radioimmunotherapy Using I-131 labeled Vascular Endothelial Growth Factor Receptor2 Antibody in Melanoma Xenograft Murine Model (흑색종에서의 I-131표지 혈관내피세포성장인자 수용체2항체를 이용한 방사면역치료 평가)

  • Kim, Eun-Mi;Jeong, Hwan-Jeong;Park, Eun-Hye;Cheong, Su-Jin;Lee, Chang-Moon;Jang, Kyu-Yun;Kim, Dong-Wook;Lim, Seok-Tae;Sohn, Myung-Hee
    • Nuclear Medicine and Molecular Imaging
    • /
    • v.42 no.4
    • /
    • pp.307-313
    • /
    • 2008
  • Purpose: Vascular endothelial growth factor (VEGF) and its receptor, fetal liver kinase 1 (Flk-1), play an important role in vascular permeability and tumor angiogenesis. The aim of this study was to evaluate the therapeutic efficacy of $^{131}I$-labeled anti-Flk-1 monoclonal antibody (DC101) on the growth of melanoma tumors, which are known to be very aggressive in vivo. Materials and Methods: Balb/c nude mice were injected subcutaneously with melanoma cells in the right flank. Tumors were allowed to grow up to $200-250\;mm^3$ in volume. Gamma camera imaging and biodistribution studies were performed to identify the uptake of $^{131}I$-DC101 in various organs. Tumor-bearing mice were randomly divided into five groups (10 mice per group) and injected intravenously every 3 or 4 days for 20 days with PBS as control (group 1), $^{131}I$-DC101 $50\;{\mu}g/mouse$ (group 2), non-labeled DC101 $50\;{\mu}g/mouse$ (group 3), $^{131}I$-DC101 $30\;{\mu}g/mouse$ (group 4), or $^{131}I$-DC101 $15\;{\mu}g/mouse$ (group 5). Tumor volume was measured with a caliper twice a week. Results: In gamma camera images, the uptake of $^{131}I$-DC101 into the tumor and thyroid increased with time. Biodistribution results showed that the radioactivity of blood and other major organs gradually decreased with time, whereas tumor uptake increased up to 48 hr and then decreased. After the 4th injection of $^{131}I$-DC101, the tumor volumes of groups 2 and 4 were significantly smaller than that of group 1. After the 5th injection, the tumor volume of group 5 was also significantly reduced. Conclusion: These results indicate that delivery of $^{131}I$ to the tumor using the Flk-1 antibody DC101 effectively blocks tumor growth in an aggressive melanoma xenograft model.

Development of Optical Molecular Imaging System for the Acquisition of Bioluminescence Signals from Small Animals (소동물 발광영상 측정을 위한 광학분자영상기기의 개발)

  • Lee, Byeong-Il;Kim, Hyeon-Sik;Jeong, Hye-Jin;Lee, Hyung-Jae;Moon, Seung-Min;Kwon, Seung-Young;Choi, Eun-Seo;Jeong, Shin-Young;Bom, Hee-Seung;Min, Jung-Joon
    • Nuclear Medicine and Molecular Imaging
    • /
    • v.43 no.4
    • /
    • pp.344-351
    • /
    • 2009
  • Purpose: Optical imaging is providing great advances in genetic and molecular imaging of animals and humans. An optical imaging system consists of optical imaging devices that carry out the major functions of monitoring, tracing, and imaging in most molecular in-vivo research. In bioluminescence imaging, small animals carrying a luciferase gene locally emit light, and the photons transmitted through the skin of the small animals are imaged using a highly sensitive charge-coupled device (CCD) camera. In this paper, we introduce an optical imaging system for the acquisition of bioluminescent signals emitted from small animals. Materials and Methods: A Nikon lens and four LED light sources were mounted inside a dark box, and a cooled CCD camera equipped with a control module was used. Results: We tested the performance of the optical imaging system using an Eppendorf tube and light-emitting bacteria injected intravenously into a CT26 tumor-bearing nude mouse. The performance of the implemented optical imaging system for bioluminescence imaging was demonstrated, and the feasibility of the system for small-animal imaging applications was shown. Conclusion: We anticipate that this system could be a useful tool for the molecular imaging of small animals, adaptable to various experimental conditions in the future.

Infrared LED Pointer for Interactions in Collaborative Environments (협업 환경에서의 인터랙션을 위한 적외선 LED 포인터)

  • Jin, Yoon-Suk;Lee, Kyu-Hwa;Park, Jun
    • Journal of the HCI Society of Korea
    • /
    • v.2 no.1
    • /
    • pp.57-63
    • /
    • 2007
  • Our research was performed in order to implement a new pointing device for human-computer interaction in collaborative environments based on a tiled display system. We mainly focused on tracking the position of an infrared light source and applying our system to various areas. Beyond the simple functionality of mouse clicking and pointing, we developed a device that can be used to help people communicate better with the computer. A strong point of our system is that it can be deployed anywhere a camera can be installed. Because the system processes only infrared light, the computational overhead for LED recognition is very low. Furthermore, by analyzing the user's movement, various actions are expected to be performed more conveniently. The system was tested for presentation and game control.
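
Because only infrared light reaches the sensor, the core of the tracking reduces to locating the bright spot in each frame and mapping its centroid to display coordinates. The sketch below illustrates that idea under stated assumptions (the camera index, brightness threshold, and display resolution are invented, and the multi-tile geometry of the actual system is not modeled):

```python
# Hypothetical IR-spot tracking sketch; thresholds and resolutions are assumptions.
import cv2

SCREEN_W, SCREEN_H = 1920, 1080    # assumed display resolution
BRIGHT_THRESH = 230                # an IR LED should nearly saturate the sensor

def track_pointer(gray):
    """Return the IR spot position mapped to screen coordinates, or None."""
    _, mask = cv2.threshold(gray, BRIGHT_THRESH, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None                # no bright spot in view
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    h, w = gray.shape
    return int(cx / w * SCREEN_W), int(cy / h * SCREEN_H)

cap = cv2.VideoCapture(0)          # camera assumed to carry an IR-pass filter
ok, frame = cap.read()
if ok:
    print("pointer at:", track_pointer(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)))
cap.release()
```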


Implementation of DID interface using gesture recognition (제스쳐 인식을 이용한 DID 인터페이스 구현)

  • Lee, Sang-Hun;Kim, Dae-Jin;Choi, Hong-Sub
    • Journal of Digital Contents Society
    • /
    • v.13 no.3
    • /
    • pp.343-352
    • /
    • 2012
  • In this paper, we implemented a touchless interface for a DID (Digital Information Display) system using gesture recognition techniques that include both hand motion and hand shape recognition. In particular, this touchless interface requires no extra attachments and gives the user both easier usage and spatial convenience. For hand motion recognition, two hand-motion parameters, slope and velocity, were measured for direction-based recognition. For hand shape recognition, the hand area image was extracted using the YCbCr color model and several image processing methods. These recognition methods are combined to generate various commands, such as next-page, previous-page, screen-up, screen-down, and mouse-click, in order to control the DID system. Finally, experimental results showed a command recognition rate of 93%, which is sufficient to confirm possible application to commercial products.
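
A minimal sketch of the two ingredients named in the abstract, YCbCr-based hand-area extraction and the slope/velocity motion parameters, is given below. The Cb/Cr bounds are commonly cited skin-tone ranges rather than the paper's thresholds, and the morphological clean-up step is an added assumption.

```python
# Hypothetical sketch: YCbCr hand-area mask plus slope/velocity motion parameters.
import cv2
import numpy as np

def extract_hand_mask(frame_bgr):
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)  # OpenCV channel order: Y, Cr, Cb
    lower = np.array([0, 133, 77], dtype=np.uint8)        # Y ignored; Cr and Cb bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # suppress speckle noise

def motion_parameters(p0, p1, dt):
    """Slope (direction) and velocity of the hand centroid between two frames."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    slope = float("inf") if dx == 0 else dy / dx
    velocity = (dx ** 2 + dy ** 2) ** 0.5 / dt
    return slope, velocity

print(motion_parameters((100, 200), (160, 180), dt=0.2))  # example: rightward swipe
```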

Implementation of Commercial IWB Interface using Image Processing (영상처리를 이용한 상업용 전자칠판의 인터페이스 구현)

  • Ko, Eunsang;Rhee, Yang Won;Lee, Chang Woo
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.17 no.6
    • /
    • pp.19-24
    • /
    • 2012
  • In this paper we introduce a commercial interactive whiteboard (IWB) system named ImSensorTouch by ImSensor Inc. Using this interface system, the computer can be controlled through the interactive whiteboard screen simply by touching it with a finger or a pen. The interface interacts with the Windows operating system (OS) and is adaptable to changes in the surroundings, especially temperature and illumination conditions. The proposed system calculates the difference between a reference image and the current image captured by a camera in the optical receptive field, and the position producing the difference is used to generate the corresponding position on the Windows screen. A mouse event at that position is then sent to the Windows OS. We implemented the system using a critical section (CS) with two threads for the reference-frame update process, in which an adaptive thresholding technique is periodically exploited to obtain reliable results. We expect the system to be competitive and to have a promising future in the IWB market.
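
The touch-detection idea, differencing the current frame against a reference frame and turning the difference position into a screen position, can be sketched as below. This is a hypothetical illustration, not ImSensor's implementation: Otsu's method stands in for the unspecified adaptive thresholding, the blob-size cut-off is an assumed value, and the Windows mouse-event injection is omitted.

```python
# Hypothetical sketch of reference-frame differencing with a guarded reference update.
import threading
import cv2

reference = None
ref_lock = threading.Lock()        # critical section guarding the reference frame

def update_reference(gray):
    """Called periodically (e.g. from a second thread) to refresh the reference."""
    global reference
    with ref_lock:
        reference = gray.copy()

def detect_touch(gray):
    """Return the (x, y) of the largest difference blob, or None if nothing changed."""
    with ref_lock:
        if reference is None:
            return None
        diff = cv2.absdiff(gray, reference)
    _, mask = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] < 50:              # fewer than ~50 changed pixels: treat as no touch
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```

The position returned here would then be translated into a Windows mouse event by the host application, as the abstract describes.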

Gaze Detection Based on Facial Features and Linear Interpolation on Mobile Devices (모바일 기기에서의 얼굴 특징점 및 선형 보간법 기반 시선 추적)

  • Ko, You-Jin;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society
    • /
    • v.12 no.8
    • /
    • pp.1089-1098
    • /
    • 2009
  • Recently, much research on more comfortable input devices based on gaze detection technology has been performed in human-computer interfaces. Previous research was performed in computer environments with a large monitor. With the recent increase in the use of mobile devices, the need for gaze-detection interfaces in mobile environments has also increased. In this paper, we study a gaze detection method using a UMPC (Ultra-Mobile PC) and its embedded camera, based on face and facial feature detection by AAM (Active Appearance Model). This paper has the following three points of originality. First, different from previous research, we propose a method for tracking the user's gaze position on a mobile device with a small screen. Second, in order to detect facial feature points, we use AAM. Third, gaze detection accuracy is not degraded with Z distance, owing to the normalization of the input features using features obtained in an initial user calibration stage. Experimental results showed that the gaze detection error was 1.77 degrees and that it was reduced by mouse dragging based on additional facial movement.
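
The linear-interpolation mapping from facial-feature coordinates to screen coordinates can be sketched as follows, assuming a four-corner calibration stage. This is only an illustration of the interpolation idea: the AAM feature extraction and the Z-distance normalization described in the paper are omitted, and all numeric values are invented.

```python
# Hypothetical sketch of linear interpolation from a facial-feature position to
# screen coordinates after a four-corner calibration. All numbers are made up.
import numpy as np

def gaze_to_screen(feature_xy, calib, screen_w, screen_h):
    """calib: feature positions recorded while gazing at the top-left, top-right,
    bottom-left and bottom-right screen corners during calibration."""
    tl, tr, bl, br = (np.asarray(p, dtype=float) for p in calib)
    f = np.asarray(feature_xy, dtype=float)
    # Horizontal/vertical ratios, averaged over the two opposite edges
    u = 0.5 * ((f[0] - tl[0]) / (tr[0] - tl[0]) + (f[0] - bl[0]) / (br[0] - bl[0]))
    v = 0.5 * ((f[1] - tl[1]) / (bl[1] - tl[1]) + (f[1] - tr[1]) / (br[1] - tr[1]))
    u, v = np.clip(u, 0.0, 1.0), np.clip(v, 0.0, 1.0)
    return u * screen_w, v * screen_h

calib = [(120, 80), (200, 80), (120, 140), (200, 140)]   # example calibration values
print(gaze_to_screen((160, 110), calib, 1024, 600))      # UMPC-sized screen (assumed)
```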


Identification of Vestibular Organ Originated Information on Spatial Memory in Mice (마우스 공간지각과 기억 형성에 미치는 전정 유래 정보의 규명)

  • Han, Gyu Cheol;Kim, Minbum;Kim, Mi Joo
    • Research in Vestibular Science
    • /
    • v.17 no.4
    • /
    • pp.134-141
    • /
    • 2018
  • Objectives: We aimed to study the role of vestibular input in spatial memory performance in mice that had undergone bilateral surgical labyrinthectomy, semicircular canal (SCC) occlusion, or 4 G hypergravity exposure. Methods: Twelve- to 16-week-old ICR mice (n=30) were used for the experiment. The mice were divided into three experimental groups: one group underwent bilateral chemical labyrinthectomy, another underwent SCC occlusion surgery, and the last group was exposed to 4 G hypergravity for 2 weeks. The movement of the mice was recorded with a camera in a Y maze that had 3 radial arms (35 cm long, 7 cm high, 10 cm wide). We counted the number of arm visits and analyzed arm-selection information before and after each procedure using a program we developed. Results: The bilateral labyrinthectomy group, in which both semicircular canal and otolithic function were impaired, showed low behavioral performance and spatial memory. The $CO_2$-laser SCC occlusion group, in which only semicircular canal function was impaired, showed no difference in activity or spatial memory. However, in the hypergravity exposure group, in which only otolithic function was impaired, spatial memory was affected while behavioral performance was spared; this spatial memory impairment recovered within a few days after exposure. Conclusions: Spatial memory function was affected by bilateral vestibular loss. Space-related information processing appears to be determined by otolithic organ information rather than by the semicircular canals. Owing to otolithic impairment, spatial learning was impaired after the animals were exposed to gravity changes, and this impaired performance was compensated after return to normal gravity.
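
The authors counted arm visits and analyzed arm selection with a program of their own, which is not shown. Purely as a hypothetical example of this kind of analysis (the alternation measure below is a common Y-maze metric and not necessarily the one used in the paper), the entry sequence could be summarized as follows:

```python
# Hypothetical Y-maze arm-entry summary; the alternation metric is a common choice,
# not necessarily the analysis performed in the paper.
from collections import Counter

def analyse_arm_entries(entries):
    """entries: chronological list of visited arm labels, e.g. ['A', 'B', 'C', 'A']."""
    triplets = [entries[i:i + 3] for i in range(len(entries) - 2)]
    alternations = sum(1 for t in triplets if len(set(t)) == 3)
    return {
        "total_entries": len(entries),
        "visits_per_arm": dict(Counter(entries)),
        "alternation_rate": alternations / len(triplets) if triplets else 0.0,
    }

print(analyse_arm_entries(list("ABCABACBCA")))
```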