• Title/Summary/Keyword: monitoring camera

Search Result 750, Processing Time 0.031 seconds

Position Control of Mobile Robot for Human-Following in Intelligent Space with Distributed Sensors

  • Jin Tae-Seok; Lee Jang-Myung; Hashimoto Hideki
    • International Journal of Control, Automation, and Systems / v.4 no.2 / pp.204-216 / 2006
  • Recent advances in hardware technology and in mobile-robot and artificial-intelligence research can be employed to develop autonomous, distributed monitoring systems. A mobile service robot requires knowledge of its present position to coexist with humans and to support them effectively in populated environments; to realize these abilities, the robot needs to keep track of relevant changes in its environment. This paper proposes a localization method for a mobile robot that uses images from distributed intelligent networked devices (DINDs) in intelligent space (ISpace). The scheme combines the position observed by dead-reckoning sensors with the position estimated from images of a moving object, such as a walking human, to determine the location of the mobile robot. The moving object is modeled as a point object and projected onto an image plane to form a geometric constraint equation that provides position data of the object based on the kinematics of the intelligent space. Using the a priori known path of the moving object and a perspective camera model, geometric constraint equations relating the image-frame coordinates of the moving object to the estimated position of the robot are derived. The proposed method uses the error between the observed and estimated image coordinates to localize the mobile robot, and a Kalman filtering scheme is used to estimate the robot's location. The approach is applied to a mobile robot in ISpace to show the reduction of uncertainty in determining the robot's location, and its performance is verified by computer simulation and experiment.
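
The fusion step described above, combining dead-reckoning motion updates with camera-based position observations through a Kalman filter, can be sketched in one dimension. This is an illustrative sketch with assumed noise values, not the authors' 2-D formulation with geometric constraint equations:

```python
# Minimal 1-D Kalman filter fusing dead-reckoning motion updates with
# camera-based position measurements (illustrative only; noise values
# Q and R are assumptions, not the paper's parameters).

def kalman_step(x, P, u, z, Q=0.05, R=0.1):
    """One predict/update cycle.
    x, P : prior state estimate and its variance
    u    : dead-reckoning displacement since the last step
    z    : position measured from the DIND camera image
    """
    # Predict: apply the odometry motion model.
    x_pred = x + u
    P_pred = P + Q
    # Update: correct with the image-based observation.
    K = P_pred / (P_pred + R)          # Kalman gain
    x_new = x_pred + K * (z - x_pred)  # weighted innovation
    P_new = (1 - K) * P_pred
    return x_new, P_new

# Example: robot starts at 0, moves ~0.5 m per step, camera observes it.
x, P = 0.0, 1.0
for z in [0.52, 1.03, 1.49]:
    x, P = kalman_step(x, P, u=0.5, z=z)
print(round(x, 3), round(P, 3))
```

The shrinking variance P illustrates the paper's claim: the image-based correction reduces the uncertainty that pure dead reckoning would accumulate.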

PID-Controlled UAV Monitoring System for Fire-Event Detection (PID 제어 UAV를 이용한 발화 감지 시스템의 구현)

  • Choi, Jeong-Wook; Kim, Bo-Seong; Yu, Je-Min; Choi, Ji-Hoon; Lee, Seung-Dae
    • The Journal of the Korea Institute of Electronic Communication Sciences / v.15 no.1 / pp.1-8 / 2020
  • If a dangerous situation arises in a place out of human reach, UAVs can be used to determine the size and location of the event and reduce further damage. With this in mind, this paper tunes the proportional, integral, and derivative (PID) values of the roll, pitch, and yaw axes using Betaflight so that the UAV stays horizontal, minimizing errors for safe and smooth hovering. For fire detection, a camera connected to a Raspberry Pi running OpenCV converts each frame to the HSV (hue, saturation, value) color space and applies a filter that renders everything black and white except red, the color closest to the fire we want to detect, so that the UAV can detect the image in the air in real time. Finally, it was confirmed that hovering was possible at heights of 0.5 to 5 m, and that red-color recognition was possible at distances from 5 cm up to 5 m.
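
The red-only HSV filter described above can be sketched with the standard library alone: pixels whose hue is near red and sufficiently saturated are kept, everything else is reduced to grayscale. The thresholds below are illustrative assumptions, not the paper's values; an OpenCV version would use `cv2.inRange` on an HSV-converted frame:

```python
# Sketch of a red-only HSV filter: keep "fire red" pixels, gray out the
# rest. Thresholds (sat_min, val_min, hue window) are assumed values.
import colorsys

def red_mask_pixel(r, g, b, sat_min=0.4, val_min=0.2):
    """Return True if an RGB pixel (0-255 channels) looks red in HSV."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    near_red = h < 0.05 or h > 0.95   # hue wraps around at red
    return near_red and s >= sat_min and v >= val_min

def filter_frame(pixels):
    """Keep red pixels; convert everything else to grayscale."""
    out = []
    for r, g, b in pixels:
        if red_mask_pixel(r, g, b):
            out.append((r, g, b))
        else:
            gray = int(0.299 * r + 0.587 * g + 0.114 * b)
            out.append((gray, gray, gray))
    return out

# Toy 3-pixel frame: red flame, green foliage, white wall.
frame = [(220, 30, 20), (30, 200, 40), (240, 240, 240)]
print(filter_frame(frame))   # only the first pixel keeps its color
```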

Development of vision system for the recognition of character image which was included at the slab image (슬라브 영상에 포함된 문자영상의 인식을 위한 비전시스템의 개발)

  • Park, Sang-Gug
    • Journal of Korea Society of Industrial Information Systems / v.12 no.1 / pp.95-100 / 2007
  • In a steel processing line, characters are marked on the surface of the material for material management. This paper describes the development of a vision system for recognizing these material-management characters included in the slab image. The vision system comprises a CCD camera system that acquires the slab image, an optical transmission system that transmits the captured image over long distances, input/output systems that interface with the existing plant systems, and a monitoring system for checking the recognition results. We installed the vision system at a continuous casting line and tested it, and also inspected its durability, reliability, and recognition rate. Testing confirmed that the system achieves a high recognition rate of 97.4%.


A Development of Urban Farm Management System based on USN (USN 기반의 도시 농업 관리 시스템 개발)

  • Ryu, Dae-Hyun
    • The Journal of the Korea Institute of Electronic Communication Sciences / v.8 no.12 / pp.1917-1922 / 2013
  • The objective of this study is to develop a USN-based urban farm management system for remote monitoring and control. The system makes it easy to manage an urban farm and builds a database of the collected information in order to establish the best environment for growing crops. To this end, we built a greenhouse and installed several types of sensors and a camera through which remote sensing information is collected, and we built a web page that lets users view information and control the environment in real time. Long-term field tests confirmed the stability of all functions, including the collection and transfer of information and environmental control in the greenhouse. The system frees farmers from time and space constraints and gives them greater flexibility in growing crops, and it can be extended to other environments and facilities such as factories, offices, and homes.

Measurement of Soil Deformation around the Tip of Model Pile by Close-Range Photogrammetry (근접 사진측량에 의한 모형말뚝 선단부 주변의 지반 변형 측정)

  • Lee, Chang No; Oh, Jae Hong
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.31 no.2 / pp.173-180 / 2013
  • In this paper, we studied the measurement of soil deformation around the tip of a model pile by close-range photogrammetry. A rigorous bundle adjustment method was used to monitor soil deformation in a laboratory model pile-load test as a function of incremental penetration of the pile. Control points were installed on the frame of the laboratory model box, and more than 150 target points were inserted into the soil around the model pile and on the surface. Four overlapping images, three horizontal and one vertical, were acquired with a non-metric camera at each penetration step. The images were processed to automatically locate the control and target points for self-calibration and bundle adjustment. During the bundle adjustment, the refraction index of the acrylic case of the laboratory model was accounted for to ensure accurate measurement. The experiment showed that the proposed approach enables automated photogrammetric monitoring of soil deformation around the tip of a model pile.
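
At the heart of the bundle adjustment above is the collinearity condition, which maps a 3-D target point into image coordinates given the camera pose and focal length; the adjustment solves the inverse problem over many points and images. A minimal pinhole-projection sketch (no rotation, an assumed focal length, and no refraction correction, unlike the paper's rigorous model) is:

```python
# Pinhole projection of a 3-D point into image-plane coordinates for a
# camera at `cam` looking along -Z with no rotation. Focal length f is
# an assumed value; the paper's model also corrects for refraction
# through the acrylic case.

def project(point, cam, f=0.035):
    """point, cam : (X, Y, Z) in metres; returns image coords in metres."""
    X, Y, Z = (p - c for p, c in zip(point, cam))
    x = -f * X / Z   # collinearity condition, simplified
    y = -f * Y / Z
    return x, y

# A target point 1 m below the camera, offset 0.2 m in X.
print(project((0.2, 0.0, -1.0), (0.0, 0.0, 0.0)))
```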

Displacement Measurement of a Floating Structure Model Using a Video Data (동영상을 이용한 부유구조물 모형의 변위 관측)

  • Han, Dong Yeob; Kim, Hyun Woo; Kim, Jae Min
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.31 no.2 / pp.159-164 / 2013
  • It is well known that video from a single moving camera can be used to extract the 3-dimensional position of an object. With this in mind, the current research performed image-based monitoring of a floating structure model using a camcorder system. The study extracted frame images from digital camcorder video clips and matched interest points to obtain relative 3D coordinates for both regular and irregular wave conditions. The researchers then evaluated the transformation accuracy of the modified SURF-based matching and the image-based displacement estimation of the floating structure model under the regular wave condition. For the regular wave condition, the wave generator's setting value was 3.0 sec and the period of the image-based displacement result was 2.993 sec; taking mechanical error into account, these values can be considered very similar. By visual inspection, the researchers observed the shape of a regular wave in the 3-dimensional and 1-dimensional figures through projections on the X, Y, and Z axes. In conclusion, it was possible to calculate the displacement of a floating structure model in near real time using an ordinary digital camcorder with 30 fps video.
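
The period comparison above (3.0 sec set vs. 2.993 sec measured) implies extracting an oscillation period from a displacement time series sampled at 30 fps. A minimal sketch of one way to do this, averaging the intervals between upward zero crossings (an assumed method, not necessarily the authors'), is:

```python
# Estimate the oscillation period of a displacement time series sampled
# at a fixed frame rate, by averaging intervals between upward zero
# crossings of the mean-centered signal.
import math

def estimate_period(signal, fps=30.0):
    mean = sum(signal) / len(signal)
    centered = [s - mean for s in signal]
    crossings = [i for i in range(1, len(centered))
                 if centered[i - 1] < 0 <= centered[i]]
    if len(crossings) < 2:
        return None
    intervals = [b - a for a, b in zip(crossings, crossings[1:])]
    return (sum(intervals) / len(intervals)) / fps

# Synthetic heave displacement: 3.0 s period at 30 fps, as in the
# regular wave condition above (9 s of video, 270 frames).
t = [i / 30.0 for i in range(270)]
z = [0.1 * math.sin(2 * math.pi * x / 3.0 + 0.3) for x in t]
print(estimate_period(z))   # prints 3.0
```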

Koh Chang Island Eco-Tourism Mapping by Balloon-born Remote Sensing Imagery System

  • Kusanagi, Michiro; Nogami, Jun; Choomnoommanee, Tanapati; Laosuwan, Teerawong; Penaflor, Eileen; Shulian, Niu; Zuyan, Yao
    • Proceedings of the KSRS Conference / 2003.11a / pp.894-896 / 2003
  • Koh Chang Island is located near the eastern border of Thailand. The government of Thailand promotes the island as a model eco-tourism spot. The island, undeveloped until recent years, is expected to turn into a major tourist attraction, and the 'Digital Koh Chang' project has thus been launched. The main objective of this project is to monitor the environment and land-use status of the island and to support its sound development. In March 2003, a field survey for this project was planned and field data were collected using both airborne and ground platforms as well as an ocean vessel. These data were combined with satellite data in the laboratory. This presentation covers the balloon-borne system field operation. A 5-meter balloon filled with helium gas was used, whose payload consisted of two RGB standard-color digital still cameras, two directional rotating servo motors, a camera-mount cradle, and signal transmitting and receiving components. A series of high-resolution aerial digital images was obtained rather easily with this inexpensive system, making it possible to monitor the intended landscape features in a specific field. Simple, low-cost, easily transportable flying platforms and local field surveys using them are useful for collecting ground-truth data to calibrate satellite- or airborne-based remote sensing data. Design analysis to upgrade the system is under further investigation.


Recent Developments in Nuclear Medicine Instrumentation (최근 핵의학 영상 기기 발전 동향)

  • Kim, Joon-Young; Choi, Yong; Kim, Jong-Ho; Im, Ki-Chun; Choe, Yearn-Seong; Lee, Kyung-Han; Kim, Sang-Eun; Kim, Byung-Tae
    • The Korean Journal of Nuclear Medicine / v.32 no.6 / pp.471-481 / 1998
  • The goals of development in nuclear medicine instrumentation are to offer higher-quality images and to aid diagnosis, prognosis assessment, and treatment planning and monitoring. Physicists and engineers must improve or design new instrumentation and techniques, and implement, validate, and apply these new approaches in the practice of nuclear medicine. Research into the physical properties of detectors and crystal materials, together with advances in image-analysis technology, has improved the quantitative and diagnostic accuracy of nuclear medicine images. This review article presents recent developments in nuclear medicine instrumentation, including scatter and attenuation correction, new detector technology, tomographic image-reconstruction methods, 511 keV imaging, dual-modality imaging devices, small gamma cameras, PET developments, and image display and analysis methods.


Individual Pig Detection Using Kinect Depth Information and Convolutional Neural Network (키넥트 깊이 정보와 컨볼루션 신경망을 이용한 개별 돼지의 탐지)

  • Lee, Junhee; Lee, Jonguk; Park, Daihee; Chung, Yongwha
    • The Journal of the Korea Contents Association / v.18 no.2 / pp.1-10 / 2018
  • Aggression among pigs adversely affects economic returns and animal welfare in intensive pigsties. Recently, some studies have applied information technology to livestock management systems to minimize the damage resulting from such anomalies. Nonetheless, detecting each pig in a crowded pigsty is still a challenging problem. In this paper, we propose a new Kinect-camera- and deep-learning-based monitoring system for the detection of individual pigs. The proposed system is characterized as follows. 1) A background subtraction method and a depth threshold are used to detect only standing pigs in the Kinect depth image. 2) The standing pigs are detected using YOLO (You Only Look Once), one of the fastest and most accurate deep learning detection models. Our experimental results show that this method is effective for detecting individual pigs in real time in terms of both cost-effectiveness (using a low-cost Kinect depth sensor) and accuracy (average detection accuracy of 99.40%).
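
Step 1 above, background subtraction plus a depth threshold, can be sketched on a depth image from a ceiling-mounted camera, where standing pigs are closer (smaller depth) than lying pigs or the floor. The threshold values here are illustrative assumptions, not the paper's calibrated parameters:

```python
# Background subtraction + depth threshold on a Kinect-style depth
# image: keep only pixels that differ from the empty-pen background AND
# are close enough to the ceiling camera to be a standing animal.

def standing_pig_mask(depth, background, diff_min=80, stand_max=1700):
    """depth, background : 2-D lists of depth values in millimetres.
    diff_min  : minimum change vs. the empty-pen background (foreground)
    stand_max : depth beyond which an animal is assumed to be lying down
    (both thresholds are assumed values for illustration)."""
    rows, cols = len(depth), len(depth[0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            is_foreground = background[r][c] - depth[r][c] >= diff_min
            is_standing = depth[r][c] <= stand_max
            mask[r][c] = is_foreground and is_standing
    return mask

# 1x3 toy image: floor (2000 mm), lying pig (1850 mm), standing pig (1500 mm).
bg = [[2000, 2000, 2000]]
d = [[2000, 1850, 1500]]
print(standing_pig_mask(d, bg))   # → [[False, False, True]]
```

In the paper's pipeline, the resulting mask regions would then be passed to the YOLO detector to separate individual pigs.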

Real-time monitoring of berthing/deberthing operations process for entering/leaving vessels using VTS system in Busan northern harbor, Korea (부산 북항에서 VTS 시스템에 의한 출입항 선박의 접이안 작업과정의 실시간 모니터링)

  • Lee, Dae-Jae
    • Journal of the Korean Society of Fisheries and Ocean Technology / v.45 no.3 / pp.165-176 / 2009
  • The process of berthing/deberthing operations for vessels entering and leaving Busan northern harbor was analyzed and evaluated using an integrated VTS (vessel traffic service) system installed in the ship training center of Pukyong National University, Busan, Korea. The integrated VTS system used in this study consisted of an ARPA radar, an ECDIS (electronic chart display and information system), a backup (recording) system, a CCTV (closed-circuit television) camera system, a gyrocompass, a differential GPS receiver, an anemometer, an AIS (automatic identification system), a VHF (very high frequency) communication system, and other components. The network of these systems was designed so that they communicate with each other automatically and exchange critical information about the course, speed, position, and intended routes of other traffic vessels in the navigational channel and Busan northern harbor. To quantitatively evaluate the overall dynamic situation, such as the maneuvering motions of a target vessel and its tugboats in transit to and from the berth structure inside the harbor, all traffic information in Busan northern harbor was automatically acquired, displayed, evaluated, and recorded. The results obtained in this study suggest that the real-time tracking information of traffic vessels acquired with an integrated VTS system can serve as useful reference data for precisely evaluating and analyzing dynamic situations, such as a collision between ship and berth structure, during berthing/deberthing operations in confined waters and harbors.