• Title/Summary/Keyword: Space camera


Development of the Camera System for Total Solar Eclipse

  • Kim, Jihun;Choi, Seonghwan;Park, Jongyeob;Bong, Su-Chan;Jang, Bi-Ho;Park, Sung-Joon;Yang, Heesu;Park, Young-Deuk;Cho, Kyungsuk
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.42 no.2
    • /
    • pp.84.3-85
    • /
    • 2017
  • Korea Astronomy and Space Science Institute (KASI) has been developing a camera system for Total Solar Eclipse (TSE) observation. In 2016 we assembled a simple camera system consisting of a commercial camera lens, a polarizer, bandpass filters, and a Canon camera to observe the solar corona during the Total Solar Eclipse in Indonesia. For the 2017 TSE observation, we studied and adapted the compact coronagraph design proposed by NASA, which dramatically reduces volume and weight and is well suited to TSE observation. The camera is used to test and verify key components, including the bandpass filter, polarizer, and CCD, during the eclipse observation. In this poster we focus on the optical engineering work, including design, analysis, testing, and assembly for the TSE observation.
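The polarizer reduction implied above can be sketched with the standard three-angle polarimetry relations used in coronagraphy. The 0°/60°/120° polarizer angles and the forward model below are generic assumptions for illustration, not specifics from the poster:

```python
import math

def three_polarizer_brightness(i0, i60, i120):
    """Recover total brightness B and polarized brightness pB from
    intensities measured through a linear polarizer at 0, 60, 120 deg
    (standard three-angle coronagraph polarimetry relations)."""
    s = i0 + i60 + i120
    p = i0 * i60 + i60 * i120 + i120 * i0
    total = (2.0 / 3.0) * s
    pb = (4.0 / 3.0) * math.sqrt(max(s * s - 3.0 * p, 0.0))
    return total, pb

def measured(theta_deg, total, pb, alpha_deg=0.0):
    """Forward model for a check: unpolarized half plus a
    cos^2 polarized term with polarization angle alpha."""
    c = math.cos(math.radians(theta_deg - alpha_deg))
    return (total - pb) / 2.0 + pb * c * c
```

Feeding three synthetic measurements through `three_polarizer_brightness` recovers the total and polarized brightness regardless of the polarization angle, which is why three angles 60° apart suffice.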


REFOCUSING FOR ON-ORBIT MTF COMPENSATION OF REMOTE SENSING CAMERA

  • Jang Hong-Sul;Jeong Dae-Jun;Lee Seunghoon
    • Proceedings of the KSRS Conference
    • /
    • 2005.10a
    • /
    • pp.601-603
    • /
    • 2005
  • Refocusing methods compensate for the degradation of a high-resolution satellite camera's optical performance during on-orbit operation. Mechanical vibration during launch and the thermal-vacuum environment of space can misalign the optical system. The dominant misalignment is focusing error, caused by de-space of the secondary mirror of the catoptric camera, which is the element most sensitive to vibration and the space environment. The high-resolution cameras of SPOT, Pleiades, and KOMPSAT-2 carry refocusing devices to adjust focus during orbital operation, whereas QuickBird (US) does not use on-orbit refocusing. For the Korsch optical configuration, which is preferred for large-aperture space remote-sensing cameras, both the secondary mirror and the folding mirror are available as refocusing elements.
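Why secondary-mirror de-space dominates can be illustrated with a common first-order two-mirror relation: a de-space of d shifts the focal plane by roughly (m² + 1)·d, where m is the secondary magnification (the mirror's longitudinal magnification m² plus its own displacement). All numbers below are illustrative assumptions, not values from the paper:

```python
def focus_shift_from_despace(despace_m, secondary_mag):
    """First-order Cassegrain/Korsch relation: secondary de-space d
    shifts the focal plane by about (m^2 + 1) * d."""
    return (secondary_mag ** 2 + 1.0) * despace_m

def geometric_blur_diameter(focus_shift_m, f_number):
    """Geometric blur-circle diameter at the detector for an f/N beam
    defocused by focus_shift_m."""
    return abs(focus_shift_m) / f_number

# Illustrative example: 10 um of secondary de-space on an m = 5, f/10 system
shift = focus_shift_from_despace(10e-6, 5.0)   # focal-plane shift
blur = geometric_blur_diameter(shift, 10.0)    # blur-circle diameter
```

The (m² + 1) leverage is why micrometer-level de-space, trivial elsewhere in the train, must be taken out by an on-orbit refocusing mechanism.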


Construction of Korean Space Weather Prediction Center: SCINTMON and All-Sky Camera

  • Kwak, Young-Sil;Hwang, Jung-A;Cho, Kyung-Suk;Bong, Su-Chan;Choi, Seong-Hwan;Park, Young-Deuk;Kyeong, Jae-Mann;Park, Yoon-Ho
    • Bulletin of the Korean Space Science Society
    • /
    • 2008.10a
    • /
    • pp.33.1-33.1
    • /
    • 2008
  • As part of the construction of the Korean Space Weather Prediction Center (K-SWPC), the Korea Astronomy and Space Science Institute (KASI) installed a Scintillation Monitor (SCINTMON) and an All-Sky Camera to observe upper-atmospheric and ionospheric phenomena. The SCINTMON, installed in the KASI building in Daejeon in cooperation with Cornell University, monitors ionospheric scintillation on GPS L-band signals. The All-Sky Camera, installed at Mt. Bohyun in Youngcheon in cooperation with the Korea Polar Research Institute, images the upper atmosphere through filters tuned to specific airglow and auroral emission wavelengths, observing upper-atmospheric disturbances, gravity-wave propagation, and aurorae. The integrated data from these instruments, including SCINTMON and the All-Sky Camera, will be used to provide space weather nowcasts and to produce reliable forecasts based on space weather prediction models.
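The amplitude scintillation that SCINTMON monitors is conventionally quantified by the S4 index, the normalized standard deviation of received signal intensity. A minimal sketch of that standard formula (the instrument's actual processing chain is not described in the abstract):

```python
def s4_index(intensity):
    """Amplitude scintillation index for a series of received signal
    intensities I: S4 = sqrt((<I^2> - <I>^2) / <I>^2)."""
    n = len(intensity)
    mean = sum(intensity) / n
    mean_sq = sum(x * x for x in intensity) / n
    return (max(mean_sq - mean * mean, 0.0) / (mean * mean)) ** 0.5
```

A perfectly steady signal gives S4 = 0; strong ionospheric scintillation drives S4 toward (and occasionally above) 1.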


Rendezvous Mission to Apophis: V. Wide-Angle Camera Science

  • JeongAhn, Youngmin;Lee, Hee-Jae;Jeong, Minsup;Kim, Myung-Jin;Choi, Jin;Moon, Hong-Kyu;Choi, Young-Jun
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.46 no.2
    • /
    • pp.59.1-59.1
    • /
    • 2021
  • The Korean spacecraft for the exploration of Apophis will be equipped with an optical navigation camera with a wide-angle lens. The major purpose of the wide-angle camera is to capture imagery during the rendezvous phase in order to determine the spacecraft's position and pointing direction relative to the asteroid Apophis. Two additional science objectives, however, can be achieved with the wide-angle camera: (1) measuring high-order gravity terms, and (2) capturing possible ejected small particles. In this presentation, we discuss the instrument specifications and the operation scenario required to accomplish these science objectives.


Design of Hardware Interface for the Otto Struve 2.1m Telescope

  • Oh, Hee-Young;Park, Won-Kee;Choi, Chang-Su;Kim, Eun-Bin;Nguyen, Huynh Anh Le;Lim, Ju-Hee;Jeong, Hyeon-Ju;Pak, Soo-Jong;Im, Myung-Shin
    • Bulletin of the Korean Space Science Society
    • /
    • 2009.10a
    • /
    • pp.25.3-25.3
    • /
    • 2009
  • To search for quasars at z > 7 in the early universe, we are developing an optical camera with a $1k\times1k$ deep-depletion CCD chip, with a later planned upgrade to a HAWAII-2RG infrared array. We will attach the camera to the Cassegrain focus of the Otto Struve 2.1 m telescope at the McDonald Observatory of the University of Texas at Austin, USA. We present the design of a hardware interface that attaches the CCD camera to the telescope; it consists of a focal reducer, a filter wheel, and a guiding camera. The focal reducer is needed to reduce the long f-ratio (f/13.7) down to about f/4 for a wide field of view. The guiding camera design is based on that of the DIAFI offset guider developed for the McDonald 2.7 m telescope.
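The effect of the focal reducer on field of view follows from simple plate-scale arithmetic. A sketch, assuming illustrative 13.5 µm pixels on a 1024-pixel chip (the abstract does not give the pixel size):

```python
RAD_TO_ARCSEC = 206265.0  # arcseconds per radian

def plate_scale_arcsec_per_pixel(focal_length_m, pixel_size_m):
    """Angle on the sky subtended by one detector pixel."""
    return RAD_TO_ARCSEC * pixel_size_m / focal_length_m

def field_of_view_arcmin(aperture_m, f_ratio, pixel_size_m, n_pixels):
    """Field of view across the chip for a telescope of a given
    aperture and working f-ratio."""
    focal_length = aperture_m * f_ratio
    scale = plate_scale_arcsec_per_pixel(focal_length, pixel_size_m)
    return scale * n_pixels / 60.0

# Native f/13.7 vs. reduced ~f/4 on the 2.1 m telescope
# (13.5 um pixels and a 1k chip are assumptions for illustration)
fov_native = field_of_view_arcmin(2.1, 13.7, 13.5e-6, 1024)
fov_reduced = field_of_view_arcmin(2.1, 4.0, 13.5e-6, 1024)
```

Whatever the pixel size, going from f/13.7 to f/4 widens the field by the ratio of the f-numbers, about a factor of 3.4 on a side.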


Point Cloud Generation Method Based on Lidar and Stereo Camera for Creating Virtual Space (가상공간 생성을 위한 라이다와 스테레오 카메라 기반 포인트 클라우드 생성 방안)

  • Lim, Yo Han;Jeong, In Hyeok;Lee, San Sung;Hwang, Sung Soo
    • Journal of Korea Multimedia Society
    • /
    • v.24 no.11
    • /
    • pp.1518-1525
    • /
    • 2021
  • Due to the growth of the VR industry and the rise of the digital twin industry, the importance of building 3D data that matches real space is increasing. However, doing so typically requires expert personnel and a large amount of time. In this paper, we propose a system that generates point cloud data with the same shape and color as a real space simply by scanning the space. The proposed system integrates the 3D geometric information from a lidar and the color information from a stereo camera into one point cloud. Since the number of 3D points generated by the lidar is not enough to express a real space with good quality, some pixels of the 2D camera image are mapped to the correct 3D coordinates to increase the number of points. Additionally, to minimize storage, overlapping points are filtered out so that only one point exists at the same 3D coordinate. Finally, the 6-DoF pose information generated from the lidar point cloud is replaced with the pose generated from the camera image to place the points more accurately. Experimental results show that the proposed system easily and quickly generates point clouds very similar to the scanned space.
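The overlap-filtering step described above can be sketched as a voxel-grid deduplication: quantize each point's coordinates to a cell and keep one point per cell. This is a generic sketch of the idea, not the authors' implementation:

```python
def deduplicate_points(points, voxel_size):
    """Keep at most one point per voxel cell.

    points: iterable of ((x, y, z), color) tuples.
    voxel_size: edge length of the quantization cell in the same units.
    The first point seen in each cell wins; later duplicates are dropped.
    """
    seen = {}
    for xyz, color in points:
        key = tuple(int(c // voxel_size) for c in xyz)
        if key not in seen:
            seen[key] = (xyz, color)
    return list(seen.values())
```

The voxel size trades capacity against fidelity: larger cells drop more near-duplicate points but blur fine structure.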

Agent-based Automatic Camera Placement for Video Surveillance Systems (영상 감시 시스템을 위한 에이전트 기반의 자동화된 카메라 배치)

  • Burn, U-In;Nam, Yun-Young;Cho, We-Duke
    • Journal of Internet Computing and Services
    • /
    • v.11 no.1
    • /
    • pp.103-116
    • /
    • 2010
  • In this paper, we propose an optimal camera placement method using agent-based simulation. To derive the importance of regions of the space and to cover the space efficiently, we ran an agent-based simulation built on a classification of the space and a pattern analysis of moving people. We developed an agent-based camera placement method that considers camera performance as well as the space priority extracted from path-finding algorithms. We demonstrate that the method not only determines the optimal number of cameras but also coordinates their positions and orientations while accounting for installation costs. To validate the method, we compare simulation results with real video footage and present experimental results simulated in a specific space.
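Coverage-driven placement of this kind is often formulated as weighted set cover and solved greedily. The following is a generic stand-in under that assumption, not the paper's agent-based algorithm; `cell_priority` plays the role of the space priority the authors derive from simulated movement patterns:

```python
def greedy_camera_placement(candidates, cell_priority, budget):
    """Greedy weighted-coverage placement.

    candidates: {camera_id: (installation_cost, set_of_covered_cells)}
    cell_priority: {cell: importance weight}
    budget: total installation cost allowed.
    Repeatedly picks the camera with the best newly-covered priority
    per unit cost until nothing affordable adds coverage.
    """
    covered, chosen, spent = set(), [], 0.0
    remaining = dict(candidates)
    while remaining:
        best_cam, best_gain = None, 0.0
        for cam, (cost, cells) in remaining.items():
            new_weight = sum(cell_priority.get(c, 0.0) for c in cells - covered)
            gain = new_weight / cost if cost > 0 else 0.0
            if gain > best_gain and spent + cost <= budget:
                best_cam, best_gain = cam, gain
        if best_cam is None:
            break
        cost, cells = remaining.pop(best_cam)
        chosen.append(best_cam)
        covered |= cells
        spent += cost
    return chosen, covered
```

Greedy set cover carries a well-known (1 - 1/e) approximation guarantee for this objective, which is why it is a common baseline for placement problems.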

SPECIFIC ANALYSIS OF WEB CAMERA AND HIGH RESOLUTION PLANETARY IMAGING (웹 카메라의 특성 분석 및 고해상도 행성촬영)

  • Park, Young-Sik;Lee, Dong-Ju;Jin, Ho;Han, Won-Yong;Park, Jang-Hyun
    • Journal of Astronomy and Space Sciences
    • /
    • v.23 no.4
    • /
    • pp.453-464
    • /
    • 2006
  • A web camera is usually used for video communication between PCs; it has a small sensing area and cannot take long exposures, so it is insufficient for most astronomical applications. However, a web camera is well suited to bright targets such as planets and the Moon, which do not need long exposure times, so many amateur astronomers use web cameras for planetary imaging. We used a ToUcam manufactured by Philips for planetary imaging and the commercial program RegiStax to combine the video frames. We then measured properties of the web camera, such as linearity and gain, that are commonly used to analyze CCD performance. Because the technique combines only high-quality frames selected from the video, it yields higher-resolution planetary images than single exposures taken with film, digital cameras, or CCDs. We describe the planetary observing method and the video-frame combination method.
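The select-and-stack step performed with tools like RegiStax can be sketched as follows. The variance-of-gradient sharpness metric is an illustrative choice, and frame registration/alignment is omitted for brevity:

```python
import numpy as np

def stack_best_frames(frames, keep_fraction=0.1):
    """Lucky-imaging style stacking: score each frame by a simple
    sharpness metric (variance of the image gradient), keep the best
    fraction, and average them. Registration is omitted here."""
    frames = [np.asarray(f, dtype=float) for f in frames]

    def sharpness(img):
        gy, gx = np.gradient(img)
        return float(np.var(gx) + np.var(gy))

    ranked = sorted(frames, key=sharpness, reverse=True)
    k = max(1, int(len(ranked) * keep_fraction))
    return np.mean(ranked[:k], axis=0)
```

Averaging the sharpest few percent of thousands of short video frames freezes moments of good seeing while beating down noise, which is why this approach outperforms single long exposures for bright planets.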

Appearance Based Object Identification for Mobile Robot Localization in Intelligent Space with Distributed Vision Sensors

  • Jin, TaeSeok;Morioka, Kazuyuki;Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.4 no.2
    • /
    • pp.165-171
    • /
    • 2004
  • Robots will coexist with humans and support them effectively in the near future. One of the most important aspects of developing human-friendly robots is cooperation between humans and robots. In this paper, we propose a method for multi-object identification in order to achieve such a human-centered system and robot localization in Intelligent Space. Intelligent Space is a space in which many intelligent devices, such as computers and sensors, are distributed; it achieves human-centered services by accelerating the physical and psychological interaction between humans and intelligent devices. As an intelligent device of the Intelligent Space, a color CCD camera module, which includes processing and networking parts, has been chosen. Intelligent Space must identify and track multiple objects in order to provide appropriate services to users in a multi-camera environment. To achieve seamless tracking and location estimation, many camera modules are distributed, which causes object-identification errors between different camera modules. This paper describes an appearance-based object representation for the distributed vision system in Intelligent Space that achieves consistent labeling of all objects. We then discuss how to learn the object color appearance model and how to achieve multi-object tracking under occlusions.
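A color appearance model of the kind discussed above is often a normalized color histogram, with objects matched across cameras by histogram intersection. A minimal sketch under that assumption (not the authors' exact model):

```python
import numpy as np

def color_histogram(pixels, bins=8):
    """Normalized 3D RGB histogram as a simple appearance model.
    pixels: (N, 3) array of RGB values in [0, 256)."""
    pixels = np.asarray(pixels)
    hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=[(0, 256)] * 3)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical appearance models."""
    return float(np.minimum(h1, h2).sum())
```

In a distributed setup, each camera module would compare a detected object's histogram against the shared models and assign the label with the highest intersection, keeping labels consistent across views.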

Engineering run of CQUEAN

  • Park, Won-Kee;Kim, Eun-Bin;Jeong, Hyeon-Ju;Kim, Jin-Young;Lim, Ju-Hee;Choi, Chang-Su;Jeon, Yi-Seul;Pak, Soo-Jong;Im, Myung-Shin
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.35 no.2
    • /
    • pp.62.1-62.1
    • /
    • 2010
  • CQUEAN (Camera for QUasars in EArly uNiverse) is an optical CCD camera system that consists of a science CCD camera, a guide CCD camera, and seven filters. In addition, a focal reducer is installed in front of the science camera to secure a larger field of view. An engineering run of the system was carried out from Aug. 10 to Aug. 17, 2010, with the 2.1 m Otto Struve telescope at McDonald Observatory, USA, from which we investigated the characteristics and performance of the system. Bias and dark images were taken under various temperature conditions to examine the system behavior, and both twilight and dome flat images were obtained to establish appropriate preprocessing procedures for the data. A crude initial estimate indicated that a one-hour integration with CQUEAN on the 2.1 m telescope would reach a limiting magnitude of 24.2 in the i-band at a S/N ratio of 5. The detailed results of the engineering run will be presented.
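The quoted limiting magnitude scales with exposure time in the background-limited regime, where S/N grows as √t, so at fixed S/N the limit deepens by 1.25·log10(t/t₀) magnitudes. A sketch anchored to the abstract's estimate (24.2 mag in i-band at S/N = 5 for one hour); the background-limited assumption is mine:

```python
import math

def limiting_mag(t_hours, m_ref=24.2, t_ref=1.0):
    """Background-limited limiting-magnitude scaling with exposure
    time: S/N ~ sqrt(t), so the limit deepens by 1.25*log10(t/t_ref).
    Defaults use the abstract's estimate (i-band, S/N = 5, 1 hr)."""
    return m_ref + 1.25 * math.log10(t_hours / t_ref)
```

For example, quadrupling the exposure to four hours buys roughly 0.75 mag of extra depth under this scaling.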
