• Title/Summary/Keyword: Omni Directional


Design of a Cleaning Robot with Omni-directional Mobility (전방향 이동이 가능한 청소로봇의 구동장치)

  • Jin, Taeseok
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2014.10a / pp.899-901 / 2014
  • This paper presents the design of a cleaning robot with omni-directional mobility. The robot is driven by three omni-wheels, which enable it to move in any direction, so that lateral movement is possible. A three-wheel mechanism using ball-type tires has been developed to realize a holonomic omni-directional robot.

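The three-omni-wheel drive described in this abstract is commonly modeled with a standard inverse-kinematic mapping from a desired body velocity to individual wheel speeds. The sketch below is not taken from the paper; the wheel mounting angles, radii, and function name are all assumptions:

```python
import math

def omni_wheel_speeds(vx, vy, omega, wheel_radius=0.05, robot_radius=0.15):
    """Inverse kinematics for a three-wheel omni-directional base.

    Wheels are assumed mounted 120 degrees apart (at 90, 210 and 330
    degrees around the body frame), rolling tangentially. Returns the
    angular speed (rad/s) each wheel needs so the robot moves with body
    velocity (vx, vy) m/s and yaw rate omega rad/s.
    """
    speeds = []
    for deg in (90.0, 210.0, 330.0):
        a = math.radians(deg)
        # tangential velocity this wheel must supply at its contact point
        v = -math.sin(a) * vx + math.cos(a) * vy + robot_radius * omega
        speeds.append(v / wheel_radius)
    return speeds
```

For a pure rotation all three wheels spin at the same speed, while for a pure translation the three speeds sum to zero; both are quick sanity checks on the 120-degree geometry.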

An Experimental Study on Control and Development of an Omni-directional Mobile Robot (전방향 이동로봇의 제작과 제어에 관한 실험연구)

  • Lee, Jeong Hyung;Jung, Seul
    • Journal of the Korean Institute of Intelligent Systems / v.24 no.4 / pp.412-417 / 2014
  • This paper presents the development and control of an omni-directional holonomic mobile robot platform equipped with three lateral orthogonal-wheel assemblies. Omni-directionality is achieved with decoupled rotational and translational motions. Simulation studies on collision avoidance are conducted. A real robot is built and its hardware is implemented to control the robot. The control algorithm is embedded on DSP and FPGA chips; hardware for motor control, such as the PWM, encoder counter, and serial communication modules, is implemented on an FPGA chip. Experimental studies of following joystick commands are performed to demonstrate the functionality and controllability of the robot.

Georeferencing of Indoor Omni-Directional Images Acquired by a Rotating Line Camera (회전식 라인 카메라로 획득한 실내 전방위 영상의 지오레퍼런싱)

  • Oh, So-Jung;Lee, Im-Pyeong
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.30 no.2 / pp.211-221 / 2012
  • To utilize omni-directional images acquired by a rotating line camera for indoor spatial information services, we must precisely register the images with respect to an indoor coordinate system. In this study, we therefore develop a georeferencing method to estimate the exterior orientation parameters of an omni-directional image, that is, the position and attitude of the camera at acquisition time. First, we derive the collinearity equations for the omni-directional image by geometrically modeling the rotating line camera. We then estimate the exterior orientation parameters using the collinearity equations with indoor control points. The experimental results from the application to real data indicate that the exterior orientation parameters are estimated with a precision of 1.4 mm in position and 0.05° in attitude. The residuals are within 3 and 10 pixels in the horizontal and vertical directions, respectively. In particular, the residuals in the vertical direction retain systematic errors mainly due to lens distortion, which should be eliminated through a camera calibration process. Using omni-directional images georeferenced precisely with the proposed method, we can generate high-resolution indoor 3D models and sophisticated augmented reality services based on those models.
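The collinearity model in this abstract starts from the interior geometry of a rotating line camera: each image column corresponds to a rotation angle of the line sensor, and each row maps through the sensor's focal length to a vertical viewing angle. A minimal sketch of that pixel-to-ray mapping follows; it is not the paper's full collinearity model (which also includes the exterior orientation), and the function name and the linear column-to-azimuth assumption are mine:

```python
import math

def panoramic_ray(col, row, width, height, focal_px):
    """Unit direction vector, in the camera frame, for a pixel of a
    cylindrical panorama captured by a rotating line camera.

    The column is assumed to map linearly to azimuth over a full 360-degree
    sweep; the row maps through the per-column pinhole model (focal length
    in pixels) to an elevation angle.
    """
    azimuth = 2.0 * math.pi * col / width
    y = height / 2.0 - row              # pixels above the optical centre row
    elev = math.atan2(y, focal_px)      # vertical viewing angle
    return (math.cos(elev) * math.cos(azimuth),
            math.cos(elev) * math.sin(azimuth),
            math.sin(elev))
```

Georeferencing then amounts to rotating and translating such rays by the camera's attitude and position, and solving for those parameters from indoor control points.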

A Simple CPW-Fed UWB Antenna Design

  • Park, Sang-Yong;Oh, Seon-Jeong;Park, Jong-Kweon
    • Journal of Electromagnetic Engineering and Science / v.10 no.1 / pp.13-17 / 2010
  • In this paper, we describe a simple CPW-fed UWB antenna for wireless UWB communication. The proposed antenna consists of two symmetrical strips having two steps and CPW feeding. Two techniques (a symmetrical structure and two steps) are used to produce low dispersion and good impedance matching. The proposed UWB antenna has an omni-directional radiation pattern, compact size, low dispersion, and low cost.

Depth estimation by using a double conic projection (이중원뿔 투영을 이용한 거리의 추정)

  • 김완수;조형석;김성권
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 1997.10a / pp.1411-1414 / 1997
  • Distance information is essential for completely executing assembly tasks such as grasping and insertion. In this paper, we propose a method for estimating the distance from a sensor to an object using the omni-directional image sensing system for assembly (OISSA), and show its features and feasibility through computer simulation. The method, based on a forward-motion stereo technique, makes the search for corresponding points simple and can immediately obtain three-dimensional 2π-shape information.

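The forward-motion stereo idea in this abstract can be illustrated with the classic similar-triangles relation: under a pure forward translation, a scene point's radial image position grows, and depth follows from the two measurements and the baseline. This is a hedged sketch using generic pinhole geometry, not the paper's double-conic projection model, and the function name is an assumption:

```python
def depth_from_forward_motion(r1, r2, baseline):
    """Depth of a scene point from its radial image positions before (r1)
    and after (r2) a pure forward translation of `baseline`.

    With a pinhole of focal length f and a point at lateral offset R,
    r = f*R/Z, so after moving forward by `baseline` the depth satisfies
    Z = baseline * r2 / (r2 - r1).
    """
    if r2 <= r1:
        raise ValueError("point must move outward (r2 > r1) under forward motion")
    return baseline * r2 / (r2 - r1)
```

With f = 1, R = 1 and an initial depth of 5, moving forward by 1 changes the radial position from 0.2 to 0.25, and the formula recovers the original depth of 5.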

Multi-views face detection in Omni-directional camera for non-intrusive iris recognition (비강압적 홍채 인식을 위한 전 방향 카메라에서의 다각도 얼굴 검출)

  • 이현수;배광혁;김재희;박강령
    • Proceedings of the IEEK Conference / 2003.11b / pp.115-118 / 2003
  • This paper describes a system for detecting multi-view faces and estimating their poses in an omni-directional camera environment for non-intrusive iris recognition. The work is divided into two parts. First, a moving region is identified using difference-image information, and this region is then analyzed with face-color information to find the face candidate region. Second, PCA (Principal Component Analysis) is applied to detect multi-view faces and estimate face pose.

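The first stage described in this abstract, finding the moving region from difference-image information, reduces to thresholding the per-pixel frame difference. A minimal sketch in plain Python; the threshold value and function name are assumptions, and the paper's subsequent face-color and PCA stages are not shown:

```python
def moving_region_mask(prev, curr, threshold=20):
    """Binary mask of pixels whose grey value changed by more than
    `threshold` between two frames (given as lists of rows) -- the
    difference-image step used to locate a moving candidate region.
    """
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]
```

In practice the raw mask would be cleaned with morphological filtering before the face-color analysis is applied to the surviving region.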

Omni-directional 3D Display System for Collaborative Work on Round Table

  • Okumura, Mitsuru;Sakamoto, Kunio;Nomura, Shusaku;Hirotomi, Tetsuya;Shiwaku, Kuninori;Hirakawa, Masahito
    • Proceedings of the Korean Information Display Society Conference / 2009.10a / pp.861-864 / 2009
  • The authors have developed a display system that can be viewed from any direction. In this paper, we propose an omni-directional 3D display system for cooperative activity on a round table.


A MAC Protocol using Directional Antennas to Solve the Deafness Problem (Deafness 문제를 해결하기 위한 지향성 MAC 프로토콜)

  • An, Han-Soon;Hong, Sung-Peel;Kahng, Hyun-Kook
    • Proceedings of the Korea Information Processing Society Conference / 2007.11a / pp.917-920 / 2007
  • Wireless ad-hoc networks mainly use the IEEE 802.11 MAC protocol, which reserves the channel with RTS-CTS control messages before transmitting data and uses an omni-directional antenna for all communication. In this paper, to improve performance over the conventional IEEE 802.11 MAC protocol, we use a MAC protocol based on directional antennas. A directional MAC protocol increases spatial reuse compared with IEEE 802.11 and can therefore use channel resources more efficiently. Directional antennas also provide higher antenna gain and longer transmission range due to their directivity, and when the transmission range is kept equal to that of an omni-directional antenna, they enable low-power communication. However, although directional antennas outperform the IEEE 802.11 MAC, they introduce new problems: the new hidden terminal problem, deafness, capture, and issues related to location awareness. This paper explains the advantages and problems of directional antennas mentioned above and proposes a method to mitigate the deafness problem. The performance of the proposed protocol is evaluated through simulation using QualNet 4.0.


An Effective Coverage Extension Scheme for Trisector Cellular Systems using Multi-hop Relay based on IEEE 802.16j (IEEE 802.16j 기반의 중계기를 도입한 3섹터 셀룰러 시스템에서 효율적인 기지국 커버리지 확장 기법)

  • Yoo, Chang-Jin;Kim, Seung-Yeon;Cho, Choong-Ho;Lee, Hyong-Woo;Ryu, Seung-Wan
    • Journal of KIISE: Information Networking / v.37 no.4 / pp.294-300 / 2010
  • In this paper, we analyze an effective coverage extension scheme for tri-sector cellular systems using multi-hop relay based on IEEE 802.16j. The proposed IEEE 802.16j MMR (Mobile Multi-hop Relay) standard considers omni-directional, 3-sector, and 6-sector antennas for the base station and relay stations. An omni-directional antenna can serve all directions, but throughput decreases due to interference from nearby relay stations. With directional antennas, the beams can be arranged so that interference with neighboring base stations and relay stations is reduced, and the effective throughput is therefore higher than in an omni-directional antenna system. However, when both the base station and relay stations use directional antennas, co-channel interference caused by channel reuse in neighboring cells reduces efficiency. In this study, we propose a multi-tier arrangement of base stations and relay stations with directional NBTC and WBTC antennas. We compare it with configurations in which the multi-hop relay stations use omni-directional, NBTC, and WBTC antennas, and analyze the relation between performance degradation and cell coverage extension as the number of relay hops increases.

Omni-directional Vision SLAM using a Motion Estimation Method based on Fisheye Image (어안 이미지 기반의 움직임 추정 기법을 이용한 전방향 영상 SLAM)

  • Choi, Yun Won;Choi, Jeong Won;Dai, Yanyan;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.20 no.8 / pp.868-874 / 2014
  • This paper proposes a novel mapping algorithm for omni-directional vision SLAM based on extracting obstacle features using Lucas-Kanade optical flow (LKOF) motion detection and images obtained through fish-eye lenses mounted on robots. Omni-directional image sensors suffer distortion because they use a fish-eye lens or mirror, but real-time image processing for mobile robots is possible because all information around the robot is measured at once. Previous omni-directional vision SLAM research used feature points in corrected fisheye images, whereas the proposed algorithm corrects only the feature points of the obstacle, which yields faster processing than previous systems. The core of the proposed algorithm may be summarized as follows. First, we capture instantaneous 360° panoramic images around the robot through downward-mounted fish-eye lenses. Second, we remove the feature points of the floor surface using a histogram filter and label the extracted obstacle candidates. Third, we estimate the locations of obstacles from motion vectors using LKOF. Finally, the robot position is estimated using an Extended Kalman Filter based on the obstacle positions obtained by LKOF, and a map is created. We confirm the reliability of the mapping algorithm through a comparison between maps obtained using the proposed algorithm and real maps.
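The final step in this abstract fuses obstacle measurements into the robot pose estimate with an Extended Kalman Filter. The scalar update below only illustrates the measurement-fusion step of such a filter; it is not the paper's full EKF over the robot pose, and the names are assumptions:

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.

    x, P : prior state estimate and its variance
    z, R : measurement and its variance
    Returns the posterior estimate and variance. The Kalman gain K weighs
    the measurement against the prior in proportion to their confidences.
    """
    K = P / (P + R)          # gain: how much to trust the measurement
    x_new = x + K * (z - x)  # correct the estimate by the innovation
    P_new = (1.0 - K) * P    # fused estimate is more certain than either
    return x_new, P_new
```

An EKF generalizes this to a state vector (robot pose and landmark positions) by linearizing the motion and measurement models at each step; the gain computation and innovation-weighted correction have the same structure.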