• Title/Summary/Keyword: omnidirectional


Omnidirectional Circularly Polarized Antenna Using Zeroth-Order Resonance (영차 공진을 이용한 전방향성 원형 편파 안테나)

  • Park, Byung-Chul; Lee, Jeong-Hae
    • The Journal of Korean Institute of Electromagnetic Engineering and Science, v.20 no.8, pp.806-812, 2009
  • In this paper, an omnidirectional circularly polarized (CP) antenna using an arc-shaped mushroom structure with curved branches is proposed. To obtain vertical polarization and an omnidirectional radiation pattern, the CP antenna uses the zeroth-order resonance (ZOR) mode of a composite right/left-handed (CRLH) transmission line. The horizontal polarization is produced by the curved branches, and the spacing between the curved branches and the arc-shaped mushroom structure gives a 90° phase difference between the vertical and horizontal polarizations. The proposed antenna therefore has an omnidirectional CP radiation pattern in the azimuthal plane. Its electrical size is reduced by 38% compared with a previously reported omnidirectional CP antenna, and the design requires neither a 90° phase shifter nor a dual feed line. A bazooka balun is used for good impedance matching and a clean radiation pattern. The antenna was optimized to improve the 3 dB axial ratio in the XY plane; after optimization, the measured 3 dB axial-ratio coverage in the XY plane spans 86°~282°.
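
The 90° phase offset between equal-amplitude vertical and horizontal field components described above is what yields circular polarization. As a rough illustration (not the paper's procedure), the axial ratio of the resulting polarization ellipse can be computed from the two component amplitudes and their phase difference:

```python
import numpy as np

def axial_ratio_db(e_v, e_h, delta_deg):
    """Axial ratio (dB) of the polarization ellipse formed by orthogonal
    field amplitudes e_v, e_h with phase difference delta_deg."""
    gamma = np.arctan2(e_h, e_v)                                # amplitude-ratio angle
    delta = np.radians(delta_deg)
    eps = 0.5 * np.arcsin(np.sin(2 * gamma) * np.sin(delta))    # ellipticity angle
    return 20 * np.log10(1.0 / abs(np.tan(eps)))                # major/minor axis ratio in dB

print(axial_ratio_db(1.0, 1.0, 90))   # equal amplitudes, 90 deg -> ~0 dB (ideal CP)
print(axial_ratio_db(1.0, 1.0, 60))   # phase error degrades the axial ratio (~4.8 dB)
```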

Catadioptric Omnidirectional Stereo Imaging System and Reconstruction of 3-dimensional Coordinates (Catadioptric 전방향 스테레오 영상시스템 및 3차원 좌표 복원)

  • Kim, Soon-Cheol; Yi, Soo-Yeong
    • Journal of the Korea Academia-Industrial cooperation Society, v.16 no.6, pp.4108-4114, 2015
  • Image acquisition using an optical mirror is called a catadioptric method; it is generally used to capture 360-degree, all-directional visual information in a single image, a typical example being a bowl-shaped hyperbolic mirror. In this paper, a single-camera omnidirectional stereo imaging method using an additional concave lens is studied. By matching the two views with different viewpoints contained in the omnidirectional stereo image, the 3-dimensional coordinates of surrounding objects can be obtained. The system is cost-effective and relatively easy to match for correspondences because the camera intrinsic parameters are consistent across the stereo pair. The parameters of the imaging system are extracted through a 3-step calibration, the performance of 3-dimensional coordinate reconstruction is verified through experiments, and the measurable range of the proposed system is presented through a depth-resolution analysis.
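
The sketch below is only a guess at the underlying geometry (a vertical baseline between the two effective viewpoints along the mirror axis), not the paper's calibrated model; it shows how a 3D point follows from two elevation angles once the baseline is known:

```python
import numpy as np

def triangulate(theta_low_deg, theta_high_deg, baseline_m):
    """Horizontal range r and height z of a point seen at elevation angles
    theta_low (lower viewpoint) and theta_high (upper viewpoint), the two
    viewpoints being separated vertically by baseline_m along the mirror axis."""
    t_low, t_high = np.tan(np.radians([theta_low_deg, theta_high_deg]))
    r = baseline_m / (t_low - t_high)   # angular disparity fixes the range
    z = r * t_low                       # height above the lower viewpoint
    return r, z

print(triangulate(30.0, 25.0, 0.05))    # ~0.45 m range for 5 deg of disparity
```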

Performance Analysis on View Synthesis of 360 Videos for Omnidirectional 6DoF in MPEG-I (MPEG-I의 6DoF를 위한 360 비디오 가상시점 합성 성능 분석)

  • Kim, Hyun-Ho; Kim, Jae-Gon
    • Journal of Broadcast Engineering, v.24 no.2, pp.273-280, 2019
  • With the spread of VR applications, 360 video is attracting attention as an immersive medium, and the MPEG-I (Immersive) Visual group is actively standardizing support for immersive media experiences with up to six degrees of freedom (6DoF). In the virtual space of omnidirectional 6DoF, defined as the case that provides 6DoF within a restricted area, viewing the scene from an arbitrary position requires rendering additional viewpoints, called virtual omnidirectional viewpoints, by view synthesis. This paper presents view-synthesis performance results and their analysis, carried out as exploration experiments (EEs) on omnidirectional 6DoF in MPEG-I. Specifically, it reports synthesis results under various conditions, such as the distance between the input views and the virtual view to be synthesized and the number of input views selected from the given set of 360 videos providing omnidirectional 6DoF.
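
One of the EE conditions above is how many input views to use and how far they lie from the virtual view. A trivial, hypothetical helper (names and logic are my own, not the EE software) that picks the capture positions nearest to a target viewpoint:

```python
import numpy as np

def select_input_views(view_positions, target_position, n_views):
    """Indices of the n_views capture positions closest to the target
    virtual viewpoint, a simple proxy for the EE selection condition."""
    d = np.linalg.norm(np.asarray(view_positions) - np.asarray(target_position), axis=1)
    return np.argsort(d)[:n_views].tolist()

anchors = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]]   # hypothetical 360 capture positions
print(select_input_views(anchors, [0.2, 0.1, 0.0], 2))   # -> [0, 1]
```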

Optical Design of a Modified Catadioptric Omnidirectional Optical System for a Capsule Endoscope to Image Simultaneously Front and Side Views on a RGB/NIR CMOS Sensor (RGB/NIR CMOS 센서에서 정면 영상과 측면 영상을 동시에 결상하는 캡슐 내시경용 개선된 반사굴절식 전방위 광학계의 광학 설계)

  • Hong, Young-Gee; Jo, Jae Heung
    • Korean Journal of Optics and Photonics, v.32 no.6, pp.286-295, 2021
  • A modified catadioptric omnidirectional optical system (MCOOS) using an RGB/NIR CMOS sensor is optically designed for a capsule endoscope, with a front field of view (FOV) in visible (RGB) light and a side FOV in visible and near-infrared (NIR) light. The front image is captured by the front imaging lens system of the MCOOS, which consists of three additional lenses arranged behind the secondary mirror of the catadioptric omnidirectional optical system (COOS) together with the imaging lens system of the COOS; the side image is formed by the COOS itself. The Nyquist frequencies of the sensor in the RGB and NIR spectra are 90 lp/mm and 180 lp/mm, respectively. The design specifications set an overall length of 12 mm, an F-number of 3.5, and front and side half-FOVs of 70° and 50°-120° for the MCOOS. As a result, the COOS achieves a spatial frequency of 154 lp/mm at a modulation transfer function (MTF) of 0.3, a depth of focus (DOF) of -0.051 to +0.052 mm, and a cumulative probability of tolerance (CPT) of 99%. The front imaging lens system of the optimized MCOOS attains a spatial frequency of 170 lp/mm at an MTF of 0.3, a DOF of -0.035 to +0.051 mm, and a CPT of 99.9%.
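
For context on the 90 lp/mm and 180 lp/mm figures quoted above: a sensor's Nyquist frequency follows directly from its pixel pitch, since one line pair needs two pixels. The pitch values below are inferred for illustration, not taken from the paper:

```python
def nyquist_lp_per_mm(pixel_pitch_um):
    """Sensor Nyquist frequency in line pairs per mm: f_N = 1 / (2 * pitch)."""
    return 1000.0 / (2.0 * pixel_pitch_um)

print(nyquist_lp_per_mm(5.56))   # ~90 lp/mm
print(nyquist_lp_per_mm(2.78))   # ~180 lp/mm
```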

Isotropic Configurations of Omnidirectional Mobile Robots with Three Caster Wheels

  • Kim, Sung-Bok; Lee, Jae-Young; Kim, Hyung-Gi
    • 제어로봇시스템학회 학술대회논문집 (Institute of Control, Robotics and Systems Conference Proceedings), 2003.10a, pp.2066-2071, 2003
  • In this paper, we identify the isotropic configurations of an omnidirectional mobile robot with three caster wheels, depending on the selection of actuated joints. First, we obtain the kinematic model of a caster-wheeled omnidirectional mobile robot (COMR) without matrix inversion: for a given task velocity, the instantaneous motion of each wheel is decomposed into two orthogonal instantaneous motions of the steering and rotating joints. Second, with the characteristic length introduced, we derive the isotropy conditions of a COMR having n (≥3) actuated joints, which are imposed on two Jacobian matrices, A ∈ R^{n×3} and B ∈ R^{6×6}. Under the condition B ∝ I_6, the three caster wheels should have identical structure, with the length of the steering link equal to the radius of the wheel. Third, depending on the selection of actuated joints, we derive the conditions for A^T A ∝ I_3 and identify the isotropic configurations of a COMR. All possible actuation sets, with different numbers of actuated joints and different combinations of rotating and steering joints, are considered.
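
The isotropy condition A^T A ∝ I_3 above is equivalent to requiring that all singular values of the Jacobian A be equal. A small sketch of that check on an arbitrary Jacobian (the matrix here is a placeholder, not the COMR kinematics):

```python
import numpy as np

def is_isotropic(A, rel_tol=1e-9):
    """True if A^T A is proportional to the identity, i.e. all singular
    values of A are equal (the paper's isotropy condition on A)."""
    s = np.linalg.svd(A, compute_uv=False)
    return np.allclose(s, s[0], rtol=rel_tol)

A = np.vstack([np.eye(3), -np.eye(3)])   # placeholder 6x3 Jacobian
print(is_isotropic(A))                    # True: A^T A = 2 * I_3
```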


Using Omnidirectional Images for Semi-Automatically Generating IndoorGML Data

  • Claridades, Alexis Richard; Lee, Jiyeong; Blanco, Ariel
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, v.36 no.5, pp.319-333, 2018
  • As human beings spend more time indoors and indoor spaces grow more complex, more attention is given to indoor spatial applications and services. 3D topological networks are used for various spatial applications that involve indoor navigation, such as emergency evacuation, indoor positioning, and visualization. Manually generating indoor network data is impractical and error-prone, yet current automated methods require expensive sensors or datasets that are difficult and costly to obtain and process. In this research, a methodology for semi-automatically generating a 3D indoor topological model based on IndoorGML (Indoor Geographic Markup Language) is proposed. The concept of a Shooting Point is defined to accommodate the use of omnidirectional images in generating IndoorGML data. Omnidirectional images were captured at selected Shooting Points in the building using a camera with a fisheye lens and a rotator, and indoor spaces were then identified using image processing implemented in Python. Relative positions of the spaces obtained from CAD (Computer-Assisted Drawing) files were used to generate 3D node-relation graphs representing adjacency, connectivity, and accessibility in the study area. Subspacing is performed to depict large indoor spaces and actual pedestrian movement more accurately. Since the images provide very realistic visualization, the topological relationships were used to link them into an indoor virtual tour.
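
The node-relation graph mentioned above can be represented very simply: each identified indoor space is a node, and edges carry the relation type. A minimal sketch with hypothetical space names (not the paper's data model):

```python
from collections import defaultdict

class NodeRelationGraph:
    """Toy IndoorGML-style node-relation graph: spaces as nodes, typed edges."""
    def __init__(self):
        self.edges = defaultdict(set)

    def add_relation(self, space_a, space_b, relation="connectivity"):
        self.edges[space_a].add((space_b, relation))
        self.edges[space_b].add((space_a, relation))

    def neighbors(self, space, relation=None):
        return sorted(s for s, r in self.edges[space] if relation in (None, r))

graph = NodeRelationGraph()
graph.add_relation("Room101", "Corridor1")                 # door: connectivity
graph.add_relation("Corridor1", "Stairwell", "adjacency")  # shared wall only
print(graph.neighbors("Corridor1"))                        # ['Room101', 'Stairwell']
```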

Watermark Extraction Method of Omnidirectional Images Using CNN (CNN을 이용한 전방위 영상의 워터마크 추출 방법)

  • Moon, Won-Jun; Seo, Young-Ho; Kim, Dong-Wook
    • Journal of Broadcast Engineering, v.25 no.2, pp.151-156, 2020
  • In this paper, we propose a watermark extraction method for omnidirectional images using a CNN (Convolutional Neural Network) to improve the extraction accuracy of previous deterministic, algorithm-based methods. The CNN consists of a restoration stage, which extracts the watermark by correcting the distortion introduced during omnidirectional image generation and/or by malicious attacks, and a classification stage, which decides which watermark has been extracted. Experiments with various attacks confirm that the extracted watermarks are more accurate than those of previous methods.
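
A minimal PyTorch sketch of the restoration-then-classification idea described above; the layer sizes, patch size, and bit count are illustrative assumptions, not the paper's architecture:

```python
import torch
import torch.nn as nn

class WatermarkExtractor(nn.Module):
    def __init__(self, num_bits=64):
        super().__init__()
        # Restoration stage: correct projection distortion / attack damage.
        self.restore = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )
        # Classification stage: decide each watermark bit from the restored patch.
        self.classify = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32, 256), nn.ReLU(),
            nn.Linear(256, num_bits),
        )

    def forward(self, x):                    # x: (batch, 1, 32, 32) watermarked patch
        return torch.sigmoid(self.classify(self.restore(x)))

bits = WatermarkExtractor()(torch.rand(4, 1, 32, 32))
print(bits.shape)                            # torch.Size([4, 64])
```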

Analysis of the Electrical and Optical Properties in Omnidirectional LED Bulbs by Energy Star (Energy Star 기준에 따른 Omnidirectional LED 벌브의 전기적 광학적 특성 분석)

  • Kim, Yu-Sin; Bae, Ho-June; Kim, Gi-Hoon; Kim, Hyun-Sik; Song, Sang-Bin
    • Journal of the Korean Institute of Electrical and Electronic Material Engineers, v.25 no.9, pp.750-754, 2012
  • An LED (light-emitting diode) has the advantages of low power consumption, energy saving, high efficiency, long lifetime, and environmental friendliness, so it has been in the spotlight as a next-generation light source, and its application range has extended to various fields including indoor and outdoor lighting. With the recent development of highly efficient LED lighting, the LED market has expanded rapidly every year, and LEDs are expected to replace general light sources in the near future. In this study, the electrical and optical properties of 6 types of LED bulbs being developed to replace general incandescent lamps were measured and analysed against the omnidirectional-lamp requirements of the Energy Star standard.

An Omnidirectional Planar Antenna with Four Stepped L-shape slots (4개의 계단형 L-슬롯 구조를 갖는 전방향성 평면 안테나)

  • Nam, Sung-Soo; Lee, Hong-Min
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology, v.1 no.3, pp.3-8, 2008
  • In this paper, a low-profile planar antenna with an omnidirectional radiation pattern in the H-plane is proposed. By adding the inductance of an ENG (epsilon-negative) shell structure, the capacitance of the electrically small antenna is easily matched in impedance. The ENG shell structure consists of an inductive loading structure with four symmetrical stepped L-shaped slots. Simulation shows that the impedance bandwidth of the proposed antenna is 150 MHz (2.5~2.65 GHz) and that the maximum radiation gain is 1.12 dBi at the center frequency of 2.56 GHz, with an omnidirectional radiation pattern. The proposed antenna is intended for wireless LAN access-point systems.
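
Since the design works by balancing the ENG shell's inductive loading against the small antenna's capacitance, the resonance follows the usual LC relation; the component values below are chosen only to land near the 2.56 GHz center frequency quoted in the abstract, not taken from the paper:

```python
import math

def resonant_frequency_ghz(inductance_nH, capacitance_pF):
    """LC resonance f0 = 1 / (2 * pi * sqrt(L * C)), returned in GHz."""
    f0 = 1.0 / (2 * math.pi * math.sqrt(inductance_nH * 1e-9 * capacitance_pF * 1e-12))
    return f0 / 1e9

print(resonant_frequency_ghz(3.0, 1.29))   # ~2.56 GHz (illustrative L and C values)
```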


MPEG Omnidirectional Media Format (OMAF) for 360 Media (360 미디어를 위한 MPEG Omnidirectional Media Format (OMAF) 표준 기술)

  • Oh, Sejin
    • Journal of Broadcast Engineering, v.22 no.5, pp.600-607, 2017
  • Virtual Reality (VR) has lately gained significant attention, driven primarily by the recent market availability of consumer devices such as mobile-phone-based Head-Mounted Displays (HMDs). Apart from classic gaming use cases, the delivery of 360° video is considered another major application and is expected to be ubiquitous in the near future. However, delivering and decoding high-resolution 360° video at desirable quality is challenging due to network limitations and constraints on end-device decoding and processing. In this paper, we focus on aspects of 360° video streaming and provide an overview and discussion of possible solutions as well as considerations for future VR video streaming applications. The paper mainly covers the status of the standardization activity on the Omnidirectional MediA Format (OMAF), which supports interoperable 360° video streaming services; MPEG's ongoing work on OMAF aims at harmonizing VR video platforms and applications. The paper also discusses the integration of OMAF content with MPEG Dynamic Adaptive Streaming over HTTP (MPEG-DASH) for 360° video streaming services, in the context of the general OMAF service architecture.