• Title/Summary/Keyword: Object Position

1,226 search results

Object Recognition Using 3D RFID System (3D RFID 시스템을 이용한 사물 인식)

  • Roh Se-gon; Lee Young Hoon; Choi Hyouk Ryeol
    • Journal of Institute of Control, Robotics and Systems / v.11 no.12 / pp.1027-1038 / 2005
  • Object recognition in the field of robotics has generally depended on computer vision systems. Recently, RFID (Radio Frequency IDentification) has been suggested as a technology that supports object recognition. This paper introduces advanced RFID-based recognition using a novel tag, named the 3D tag, which was designed to facilitate object recognition. The proposed RFID system not only detects the existence of an object but also estimates the object's orientation and position. These characteristics allow a robot to considerably reduce its dependence on other sensors for object recognition. In this paper, we analyze the characteristics of the 3D tag-based RFID system and discuss methods for estimating position and orientation with it.

Nearest Neighbor Query Processing in the Mobile Environment

  • Choi Hyun Mi; Jung Young Jin; Lee Eung Jae; Ryu Keun Ho
    • Proceedings of the KSRS Conference / 2004.10a / pp.677-680 / 2004
  • In the mobile environment, a nearest neighbor query finds the special object or place closest to the position of a moving query object. Because the query object moves continuously, however, what the query should return changes with the object's direction of travel. If only the minimum distance is considered, the result may lie against the direction the query object is heading; on most roads the user must then continue to a U-turn area and come back, wasting time and cost. To solve these problems, this paper proposes a nearest neighbor method that considers both the position and the direction of the moving object, for use in a mobile recommendation system.
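
The idea of rejecting results that lie against the direction of travel can be sketched as a direction-filtered nearest-neighbor search. This is a minimal illustration, not the paper's actual query-processing method; the function name, the ±90° acceptance cone, and the flat 2D coordinates are assumptions.

```python
import math

def directional_nearest(pos, heading_deg, candidates, max_angle_deg=90.0):
    """Return the nearest candidate roughly ahead of the moving query object.

    pos: (x, y) query position; heading_deg: direction of travel in degrees;
    candidates: list of (x, y) points. Candidates whose bearing deviates from
    the heading by more than max_angle_deg are skipped, so the query never
    recommends a point that would force a U-turn.
    """
    best, best_dist = None, float("inf")
    for c in candidates:
        dx, dy = c[0] - pos[0], c[1] - pos[1]
        bearing = math.degrees(math.atan2(dy, dx))
        # smallest signed angle between the candidate bearing and the heading
        diff = (bearing - heading_deg + 180.0) % 360.0 - 180.0
        if abs(diff) > max_angle_deg:
            continue
        dist = math.hypot(dx, dy)
        if dist < best_dist:
            best, best_dist = c, dist
    return best

# Moving east: the geometrically closest point lies behind, so it is rejected.
print(directional_nearest((0, 0), 0.0, [(-1, 0), (5, 1)]))  # → (5, 1)
```

A plain minimum-distance query would have returned (-1, 0) here; the direction filter is what avoids the U-turn case the abstract describes.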

Object Tracking Algorithm for a Mobile Robot Using Ultrasonic Sensors

  • Park, M.G.; Lee, M.C.
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2001.10a / pp.44.5-44 / 2001
  • This paper proposes an algorithm by which a mobile robot tracks an object captured by its ultrasonic sensors and automatically generates a path according to the object's movement. In the proposed algorithm, the robot detects movements of the object using ultrasonic sensors and then follows the moving object, which simplifies robot path planning. The eight ultrasonic sensors on the robot measure the distances between the robot and surrounding objects, and the robot detects movements of the object from changes in these measured distances. The target position of the robot is set to the position of the detected moving object, and the robot follows the object according to this movement strategy. The effectiveness of the proposed algorithm is verified through experiments.
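
The detection step (movement inferred from changes in the ring of distance readings) can be sketched roughly as follows. This is an illustrative simplification, not the paper's algorithm: the largest-change rule, the threshold value, and the function names are all assumptions.

```python
import math

def detect_moving_target(prev, curr, sensor_angles_deg, threshold=0.05):
    """Pick the sensor whose range reading changed most between two scans.

    prev, curr: range readings in metres from the ring of ultrasonic sensors;
    sensor_angles_deg: mounting angle of each sensor on the robot body.
    Returns (x, y) of the detected object in the robot frame, or None when
    no reading changed by more than `threshold` metres.
    """
    best_i, best_change = None, threshold
    for i, (p, c) in enumerate(zip(prev, curr)):
        change = abs(c - p)
        if change > best_change:
            best_i, best_change = i, change
    if best_i is None:
        return None  # nothing moved enough to be treated as a target
    ang = math.radians(sensor_angles_deg[best_i])
    return (curr[best_i] * math.cos(ang), curr[best_i] * math.sin(ang))

angles = [i * 45 for i in range(8)]                  # 8 sensors, 45° apart
prev = [2.0] * 8
curr = [2.0, 2.0, 1.0, 2.0, 2.0, 2.0, 2.0, 2.0]      # object approached sensor 2 (90°)
print(detect_moving_target(prev, curr, angles))      # ~1 m to the robot's left
```

The returned point would then serve as the robot's target position, as in the movement strategy the abstract describes.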

A Position Measurements of Moving Object in 2D Plane (2차원 평면상에서 이동하는 물체의 위치측정)

  • Ro, Jae-Hee; Lee, Yong-Jung; Choi, Jae-Ha; Ro, Young-Shick; Lee, Yang-Burm
    • The Transactions of the Korean Institute of Electrical Engineers A / v.48 no.12 / pp.1537-1543 / 1999
  • In this paper, a PSD (Position Sensitive Detector) sensor system that estimates the position of moving objects in a 2D plane is developed. The PSD sensor measures the position of an incident light spot in real time. To obtain the position of the light source on a moving target, a new parameter calibration algorithm and a neural network technique are proposed and applied. Real-time position measurements of a mobile robot carrying the light source are performed to validate the proposed method. The results show that the proposed technique provides accurate position estimation of the moving object.

Position Improvement of a Human-Following Mobile Robot Using Image Information of Walking Human (보행자의 영상정보를 이용한 인간추종 이동로봇의 위치 개선)

  • Jin Tae-Seok; Lee Dong-Heui; Lee Jang-Myung
    • Journal of Institute of Control, Robotics and Systems / v.11 no.5 / pp.398-405 / 2005
  • The intelligent robots that will be needed in the near future are human-friendly robots able to coexist with humans and support them effectively. To realize this, robots need to recognize their position and posture in both known and unknown environments, and their localization should occur naturally. Estimating the robot's position under uncertainty is one of the most important problems in mobile robot navigation. This paper describes a method for localizing a mobile robot using image information of a moving object. The method combines the position observed by dead-reckoning sensors with the position estimated from images captured by a fixed camera. Using an a priori known path of the moving object in world coordinates and a perspective camera model, we derive geometric constraint equations that relate the image-frame coordinates of the moving object to the estimated robot position. A control method is also proposed to estimate the position and direction between the walking human and the mobile robot, and a Kalman filter scheme is used for the estimation of the mobile robot's localization. The performance is verified by computer simulation and experiment.
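
The core of combining a dead-reckoning estimate with a camera-based one, as the abstract describes, is the Kalman measurement update: weight each source by the inverse of its variance. A minimal one-dimensional sketch (the variance values and function name are assumptions for illustration):

```python
def fuse(dead_reckoning, camera, var_dr, var_cam):
    """Minimum-variance fusion of two position estimates: the
    measurement-update core of a Kalman filter. The gain weights
    the camera correction by how uncertain dead reckoning is."""
    k = var_dr / (var_dr + var_cam)                 # Kalman gain
    fused = dead_reckoning + k * (camera - dead_reckoning)
    fused_var = (1 - k) * var_dr                    # fused estimate is less uncertain
    return fused, fused_var

# Drifting odometry (variance 0.04 m^2) corrected by a camera fix (variance 0.01 m^2)
pos, var = fuse(dead_reckoning=10.0, camera=10.5, var_dr=0.04, var_cam=0.01)
print(round(pos, 2), round(var, 3))  # → 10.4 0.008
```

The fused variance is smaller than either input variance, which is why adding the fixed camera improves on dead reckoning alone.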

A Study on the Distance and Object Recognition Applying the Airborne Ultrasonic Sensor (공중 초음파 센서를 응용한 거리 형상인식에 관한 연구)

  • Han, E.K.; Park, I.G.
    • Journal of the Korean Society for Nondestructive Testing / v.10 no.1 / pp.10-17 / 1990
  • Recently, ultrasonic sensors for object recognition have come into use with the automation of industrial machines. Points that characterize an object can be detected by measuring the propagation time of an ultrasonic impulse and the azimuth that gives its maximum amplitude; from these points the shape, position, and orientation of the object are deduced. A new measuring method is adopted, in which the distance to the object is calculated from the sound reflection time measured at the zero-cross point of the sound wave, and the azimuth is taken as the angle indicating the maximum amplitude. A measuring accuracy of 1.0 mm for distance and $0.5-2^{\circ}$ for azimuth has been achieved. By rotational scanning of the sensor, the characteristic points of an object can be found, giving information on its shape, position, and orientation. Experimental results showed that objects of somewhat complicated shape can be recognized, which suggests the method's applicability to robots.
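
The two measurements the abstract describes (distance from round-trip echo time, azimuth from the angle of maximum amplitude) reduce to short formulas. A minimal sketch, assuming sound speed in air at room temperature; the function names are illustrative:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed operating condition)

def echo_distance(time_of_flight_s, speed=SPEED_OF_SOUND):
    """Distance to a reflecting object from the round-trip echo time,
    as measured from the zero-cross point of the received wave.
    Divide by 2 because the pulse travels out and back."""
    return speed * time_of_flight_s / 2.0

def azimuth_of_max(angles_deg, amplitudes):
    """Azimuth indicated by a rotational scan: the scan angle at which
    the echo amplitude is largest."""
    return max(zip(amplitudes, angles_deg))[1]

# A 5.831 ms round trip corresponds to roughly 1 m
print(round(echo_distance(5.831e-3), 3))
# Scan at 0°, 10°, 20°, 30°: the strongest echo picks the azimuth
print(azimuth_of_max([0, 10, 20, 30], [0.1, 0.9, 0.4, 0.2]))  # → 10
```

Sweeping `azimuth_of_max` over successive scan positions yields the characteristic points from which shape, position, and orientation are deduced.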

Tactile Localization Using Whisker Tactile Sensors (수염 촉각 센서를 이용한 물체 위치 판별 그리고 이에 따른 로봇의 상대적 위치 제어 방법)

  • Kim, Dae-Eun; Moeller, Ralf
    • Proceedings of the IEEK Conference / 2008.06a / pp.1061-1062 / 2008
  • Rodents demonstrate an outstanding capability for tactile perception using their whiskers. The mechanoreceptors in the whisker follicles respond to deflections or vibrations of the whisker beams. It is believed that this sensory processing can determine the location of an object in contact, that is, the angular position and direction of the object. We designed artificial whiskers modelled on real whiskers and tested tactile localization. The robotic system needs to adjust its position relative to an object to support shape recognition, and we show a robotic position adjustment based on tactile localization. The behaviour uses the deflection curves of the whisker sensors for every sweep of the whiskers and estimates the location of a target object.

DSP Implementation of The Position Location System in Underwater Channel Environments (수중환경에서 위치추적 시스템의 DSP 구현)

  • Ko, Hak-Lim; Lim, Yong-Kon; Lee, Deok-Hwan
    • The Journal of the Acoustical Society of Korea / v.26 no.1 / pp.48-54 / 2007
  • In this paper we implement a 3-D PL (Position Location) system that estimates the three-dimensional position of a moving object in underwater environments. Four sensors fixed at different positions and the moving sensors communicate with each other to find the three-dimensional positions of both the fixed and moving objects; using this, the moving object can also be controlled remotely. When computing the position, we calculate the norm of the Jacobian matrix at every iteration of the Newton algorithm. When the norm exceeds a critical value and the solution from the inverse matrix becomes unstable, the calculation is restarted with a different initial value, which yields a more reliable position for the moving object. The proposed algorithm was used to implement a DSP system capable of real-time position location. To verify the performance, experiments were carried out in a water tank; the system located the position of an object every 2 seconds within an error range of 5 cm.
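
The Newton-style position solve from range measurements can be sketched as follows. This is a Gauss-Newton illustration under assumptions, not the paper's DSP implementation: the `norm_limit` threshold and the restart-from-centroid rule stand in for the paper's critical-value check and alternative initial value.

```python
import numpy as np

def newton_position(sensors, ranges, x0, iters=20, norm_limit=1e3):
    """Gauss-Newton solve for a 3-D position from range measurements.

    sensors: (N, 3) fixed sensor positions; ranges: N measured distances;
    x0: initial guess. The Jacobian norm is checked every iteration and the
    solve restarts from the sensor centroid if it exceeds `norm_limit`,
    mimicking the safeguard against an unstable inverse.
    """
    x = np.asarray(x0, dtype=float)
    sensors = np.asarray(sensors, dtype=float)
    for _ in range(iters):
        diff = x - sensors                        # (N, 3)
        dists = np.linalg.norm(diff, axis=1)      # predicted ranges
        J = diff / dists[:, None]                 # Jacobian of range residuals
        if np.linalg.norm(J) > norm_limit:        # unstable: restart elsewhere
            x = sensors.mean(axis=0)
            continue
        residuals = dists - np.asarray(ranges, dtype=float)
        step, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        x = x - step
    return x

sensors = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
true_pos = np.array([3.0, 4.0, 2.0])
ranges = [np.linalg.norm(true_pos - np.array(s)) for s in sensors]
print(np.round(newton_position(sensors, ranges, x0=(1, 1, 1)), 3))
```

With four fixed sensors and three unknowns the system is overdetermined, which is why a least-squares step is used rather than a plain matrix inverse.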

Asynchronous Sensor Fusion using Multi-rate Kalman Filter (다중주기 칼만 필터를 이용한 비동기 센서 융합)

  • Son, Young Seop; Kim, Wonhee; Lee, Seung-Hi; Chung, Chung Choo
    • The Transactions of The Korean Institute of Electrical Engineers / v.63 no.11 / pp.1551-1558 / 2014
  • We propose a multi-rate sensor fusion of vision and radar using a Kalman filter to solve the problems of asynchronous, multi-rate sampling periods in object vehicle tracking. A model-based prediction of object vehicles is performed with a decentralized multi-rate Kalman filter for each sensor (vision and radar). To improve the position prediction, different weights are applied to each sensor's predicted object position from the multi-rate Kalman filter. The proposed method provides an estimated position of the object vehicles at every sampling time of the ECU. The Mahalanobis distance is used to establish correspondence between measured and predicted objects. Experimental results validate that the post-processed fusion data yield improved tracking performance: the proposed method achieved a twofold improvement in object tracking performance over a single-sensor method (camera or radar) in terms of root mean square error.
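
The Mahalanobis-distance association step the abstract mentions can be sketched as follows: a detection is matched to the predicted track closest to it once distances are scaled by the position uncertainty. The covariance values and the 3-sigma gate are illustrative assumptions, not values from the paper.

```python
import numpy as np

def mahalanobis(measured, predicted, cov):
    """Mahalanobis distance between a measured and a predicted object
    position: Euclidean distance scaled by the position covariance."""
    d = np.asarray(measured, float) - np.asarray(predicted, float)
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

def associate(measurement, predictions, cov, gate=3.0):
    """Return the index of the predicted object closest to the measurement
    in Mahalanobis distance, or None if every candidate exceeds the gate
    (i.e. the detection matches no existing track)."""
    dists = [mahalanobis(measurement, p, cov) for p in predictions]
    i = int(np.argmin(dists))
    return i if dists[i] <= gate else None

cov = np.diag([0.25, 0.25])                  # assumed position uncertainty (m^2)
preds = [(10.0, 2.0), (25.0, -1.0)]          # predicted vehicle positions
print(associate((10.4, 2.3), preds, cov))    # matches the first track → 0
```

Gating by covariance rather than raw distance is what lets the fusion tolerate the differing noise levels of the vision and radar predictions.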