• Title/Summary/Keyword: Range data


Fast Local Indoor Map Building Using a 2D Laser Range Finder (2차원 레이저 레인지 파인더를 이용한 빠른 로컬 실내 지도 제작)

  • Choi, Ung;Koh, Nak-Yong;Choi, Jeong-Sang
    • Proceedings of the Korean Society of Machine Tool Engineers Conference
    • /
    • 1999.10a
    • /
    • pp.99-104
    • /
    • 1999
  • This paper proposes an efficient method for constructing a local map using the data of a scanning laser range finder. A laser range finder yields distance data in polar form, that is, a distance reading for every scanning direction, so each datum consists of a directional angle and a distance. We propose a new method to find a line fitting a set of such data. The method uses the Log-Hough transformation. Usually, map building from these data requires transformations between different coordinate systems; the new method alleviates that complication. It also simplifies the computation for line recognition and eliminates the slope-quantization problems inherent in the classical Cartesian Hough transform. To show its efficiency, the proposed method is applied to building a local map from the data of a laser range finder, the PLS (Proximity Laser Scanner, made by SICK).

  • PDF
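The line-fitting idea in the abstract above can be illustrated with a plain Cartesian-parameter Hough transform over polar scan points. This is a minimal sketch, not the paper's Log-Hough variant, and the accumulator resolutions are arbitrary assumptions:

```python
import math

def hough_lines(scan, n_alpha=180, rho_res=0.05, rho_max=5.0):
    """Find the dominant line in a 2D laser scan.
    scan: (theta, d) pairs -- scan bearing and measured distance.
    Votes in (alpha, rho) space for the line model
    rho = x*cos(alpha) + y*sin(alpha)."""
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = [[0] * n_rho for _ in range(n_alpha)]
    for theta, d in scan:
        # convert the polar reading to Cartesian once per point
        x, y = d * math.cos(theta), d * math.sin(theta)
        for ai in range(n_alpha):
            alpha = math.pi * ai / n_alpha
            rho = x * math.cos(alpha) + y * math.sin(alpha)
            ri = int(round((rho + rho_max) / rho_res))
            if 0 <= ri < n_rho:
                acc[ai][ri] += 1
    # the accumulator cell with the most votes is the best-fit line
    ai, ri = max(((a, r) for a in range(n_alpha) for r in range(n_rho)),
                 key=lambda c: acc[c[0]][c[1]])
    return math.pi * ai / n_alpha, ri * rho_res - rho_max
```

The Cartesian conversion step here is exactly the coordinate-system shuffling the paper's direct polar-domain method is designed to avoid.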

Study on Tendency of Echo Sounding by Turbidity (탁도에 따른 Echo Sounder 측심특성연구)

  • Kim, Yong-Bo;Kim, Jin-Hu
    • Proceedings of the Korean Society of Marine Engineers Conference
    • /
    • 2005.11a
    • /
    • pp.148-149
    • /
    • 2005
  • In this study, focusing on one of the main causes of degraded sounding precision, we characterize the sounding data acquired by an echo sounder as turbidity increases. To this end, we acquired sounding data while artificially adding a turbidity inducer to a water tank, and then performed a regression analysis. The conclusions are as follows: sounding capability can be divided into three ranges according to turbidity: a normal range, a critical range, and a range in which no data can be obtained by the echo sounder.

  • PDF

Effect of dynamic range consumption for microholographic data storage system (마이크로 홀로그래픽 시스템에서 미디어의 소진효과)

  • Kim, Do-Hyung;Min, Cheol-Ki;Cho, Jang-Hyun;Kim, Nak-Yeong;Park, Kyoung-Su;Park, No-Cheol;Yang, Hyun-Seok;Park, Young-Pil
    • Transactions of the Society of Information Storage Systems
    • /
    • v.7 no.1
    • /
    • pp.31-35
    • /
    • 2011
  • In a microholographic data storage system (MDSS), compact recording is required to achieve high capacity [1]. When data is recorded, neighboring monomer is also affected by the reaction at the focal point [2,3]. This unintended process causes extra monomer consumption and degrades total capacity. To avoid this extra consumption of dynamic range, the effective dynamic range for an MDSS must be defined. In this paper, we experimentally investigate the relation between dynamic range consumption and micrograting formation. Dynamic range consumption was monitored by a real-time read-out system. Microgratings were recorded at different consumption ratios and compared by the diffraction efficiency in the track direction. Finally, we define a suitable dynamic range for the MDSS.

Investigation of Airborne LIDAR Intensity data

  • Chang Hwijeong;Cho Woosug
    • Proceedings of the KSRS Conference
    • /
    • 2004.10a
    • /
    • pp.646-649
    • /
    • 2004
  • A LiDAR (Light Detection and Ranging) system can record intensity data as well as range data. Recently, LiDAR intensity data has been widely used for land-cover classification, as ancillary data for feature extraction, for vegetation species identification, and so on. Since the intensity return value depends on several factors, the same feature is not consistent within a single flight or across multiple flights. This paper investigates the correlation between intensity and range data. Once the effect of range was determined, single-flight-line and multiple-flight-line normalization were performed using an empirical function derived from the relationship between range and return intensity.

  • PDF
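The paper fits an empirical range-intensity function; as a rough illustration of the idea, a common first-order model assumes return intensity falls off as 1/R², so multiplying by (R/R_ref)² makes returns comparable across flying heights. The reference range and the quadratic falloff are assumptions here, not the paper's fitted function:

```python
def normalize_intensity(points, ref_range=1000.0):
    """points: list of (range_m, raw_intensity) LiDAR returns.
    Range-normalize intensity assuming a 1/R^2 falloff, so that the same
    ground feature yields a similar value regardless of sensor-to-target
    distance. (Illustrative model only; the paper derives an empirical
    function from the observed range-intensity relationship.)"""
    return [i * (r / ref_range) ** 2 for r, i in points]
```

A return measured at twice the reference range is scaled up fourfold, compensating for the weaker received signal.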

Comorbidity Adjustment in Health Insurance Claim Database (건강보험청구자료에서 동반질환 보정방법)

  • Kim, Kyoung Hoon
    • Health Policy and Management
    • /
    • v.26 no.1
    • /
    • pp.71-78
    • /
    • 2016
  • The value of using health insurance claim databases is continuously rising in healthcare research. In studies where comorbidities act as a confounder, comorbidity adjustment is important, yet researchers face a myriad of options without sufficient information on how to adjust appropriately. The purpose of this study is to assist in selecting an appropriate index, look-back period, and data range for comorbidity adjustment. No consensus has formed on any of the three. This study recommends the Charlson comorbidity index when predicting outcomes such as mortality, and the Elixhauser comorbidity measures when analyzing the relations between various comorbidities and outcomes. A longer look-back period and inclusion of all diagnoses from both inpatient and outpatient data increased the prevalence of comorbidities but contributed little to model performance. A limited data range, such as primary diagnoses only, may compensate for limitations of the claim database but can miss important comorbidities. This study suggests that all diagnoses from both inpatient and outpatient data, excluding rule-out diagnoses, be observed over at least a 1-year look-back period prior to the index date. The index, look-back period, and data range must all be considered for comorbidity adjustment. To provide better guidance to researchers, follow-up studies should examine the three factors for specific diseases and surgeries.
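The Charlson index recommended above is a weighted sum over a patient's comorbid conditions. A minimal sketch follows, using a small illustrative subset of the standard condition weights; a real claims-based implementation maps full ICD-9/ICD-10 code lists to each condition rather than condition names:

```python
# Illustrative subset of Charlson conditions with their standard weights.
CHARLSON_WEIGHTS = {
    "myocardial_infarction": 1,
    "congestive_heart_failure": 1,
    "diabetes_without_complication": 1,
    "diabetes_with_complication": 2,
    "moderate_severe_liver_disease": 3,
    "metastatic_solid_tumor": 6,
    "aids": 6,
}

def charlson_score(conditions):
    """Sum the weights of a patient's distinct comorbid conditions, as
    observed in the look-back window (e.g., all inpatient and outpatient
    diagnoses in the year before the index date, excluding rule-out
    diagnoses, per the study's recommendation)."""
    return sum(CHARLSON_WEIGHTS.get(c, 0) for c in set(conditions))
```

Deduplicating via `set` reflects that each condition counts once no matter how many claims record it, which is why a longer look-back period raises prevalence without necessarily improving model performance.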

AUTOMATIC ROAD NETWORK EXTRACTION USING LIDAR RANGE AND INTENSITY DATA

  • Kim, Moon-Gie;Cho, Woo-Sug
    • Proceedings of the KSRS Conference
    • /
    • 2005.10a
    • /
    • pp.79-82
    • /
    • 2005
  • The need for road data keeps growing in industrial society, and roads are being repaired and newly constructed in many areas. As governments, cities, and regions develop, updating and acquiring road data for GIS (Geographical Information System) is essential. In this study, a fusion of range data (3D ground-coordinate data) and intensity data from stand-alone LiDAR is used for road extraction, after which digital image processing methods are applied. LiDAR intensity data is still being actively studied; this study shows a possible method for road extraction using it. Intensity and range data are acquired at the same time, so LiDAR avoids the problems of multi-sensor data fusion. A further advantage of the intensity data is that it is already geocoded, at the same scale as the real world, and can be used to make ortho-photos. Finally, a quantitative and qualitative analysis compares the extracted road image with a 1:1,000 digital map.

  • PDF

An Efficient Data Structure to Obtain Range Minima in Constant Time in Constructing Suffix Arrays (접미사 배열 생성 과정에서 구간 최소값 위치를 상수 시간에 찾기 위한 효율적인 자료구조)

  • 박희진
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.31 no.3_4
    • /
    • pp.145-151
    • /
    • 2004
  • We present an efficient data structure to obtain the range minima of an array in constant time. Recently, suffix arrays have been used extensively in bioinformatics to search DNA sequences quickly. Constructing suffix arrays requires solving the range minima problem, and it must be solved in both a time-efficient and a space-efficient way, because DNA sequences consist of millions or billions of bases. Until now, the most efficient data structure for finding the range minima of an array in constant time has been based on converting the range minima problem into the LCA (Lowest Common Ancestor) problem on a Cartesian tree and then converting the LCA problem back into a range minima problem on a specific array. That data structure occupies O(n) space and is constructed in O(n) time, but because it includes the intermediate data structures required for these conversions, it requires large space (=13n) and much time. Our data structure solves the range minima problem directly, so it requires small space (=5n) and less time in practice. Theoretically, of course, it requires O(n) time and space.
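For reference, the range-minimum query problem the abstract addresses can be sketched with the standard sparse-table structure, which also answers queries in O(1) after precomputation. Note this takes O(n log n) space, unlike either the 13n-word Cartesian-tree approach or the paper's 5n-word direct method; it is shown only to make the problem concrete:

```python
def build_sparse_table(a):
    """table[j][i] = index of the minimum of a[i : i + 2**j]."""
    n = len(a)
    table = [list(range(n))]          # blocks of length 1
    j = 1
    while (1 << j) <= n:
        prev, cur = table[j - 1], []
        for i in range(n - (1 << j) + 1):
            # a block of length 2**j is the union of two length-2**(j-1) blocks
            l, r = prev[i], prev[i + (1 << (j - 1))]
            cur.append(l if a[l] <= a[r] else r)
        table.append(cur)
        j += 1
    return table

def range_min_index(a, table, lo, hi):
    """Index of the minimum of a[lo..hi] (inclusive), in O(1) time:
    cover the range with two overlapping power-of-two blocks."""
    j = (hi - lo + 1).bit_length() - 1
    l = table[j][lo]
    r = table[j][hi - (1 << j) + 1]
    return l if a[l] <= a[r] else r
```

The overlap of the two blocks is harmless because minimum is idempotent, which is what makes the constant-time query possible.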

Grouping Algorithms of Zigbee Nodes for Efficient Data Transmission to Long Range (효율적인 원거리 데이터 전송을 위한 Zigbee 노드들의 그룹화 알고리즘)

  • Woo, Sung-Je;Shin, Bok-Deok
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.61 no.4
    • /
    • pp.632-638
    • /
    • 2012
  • ZigBee, based on the IEEE 802.15.4 PHY and MAC layers, provides a specification for a suite of high-level communication protocols using small, low-power digital radios. Meshing is a type of daisy-chaining from one device to another; the technique allows the short range of an individual node to be extended and multiplied, covering a much larger area. Each wireless technology that makes it to market serves a special purpose or function, and ZigBee provides short-range connectivity in what is called a personal-area network (PAN), managed entirely by a ZigBee PAN coordinator. Because ZigBee typically offers less than 100 kbps over a short-range frequency band, retransmission problems and delayed data transmission can make long-range operation unproductive. This research proposes a method that, using a specific grouping algorithm, enables long-range remote data transmission over the short-range frequency band of ZigBee nodes.

A new fractal image decoding algorithm with fast convergence speed (고속 수렴 속도를 갖는 새로운 프랙탈 영상 복호화 알고리듬)

  • 유권열;문광석
    • Journal of the Korean Institute of Telematics and Electronics S
    • /
    • v.34S no.8
    • /
    • pp.74-83
    • /
    • 1997
  • In this paper, we propose a new fractal image decoding algorithm with fast convergence speed that uses data dependence and an improved initial-image estimation. The conventional method for fractal image decoding requires high computational complexity in the decoding process because iterated contractive transformations are applied to all range blocks. In the proposed method, the range of the reconstruction image is divided into referenced ranges and data-dependence regions, and computational complexity is reduced by applying the iterated contractive transformations to the referenced ranges only. A data-dependence region can be decoded by a single transformation once the referenced ranges have converged. In addition, a more exact initial image is estimated using a bound() function, and an initial image closer to the fixed point is estimated using range-block-division estimation. Consequently, the convergence speed of the reconstructed image is improved, with a 40% reduction in computational complexity.

  • PDF

Regularized Surface Smoothing for Enhancement of Range Data (거리영상 개선을 위한 정칙화 기반 표면 평활화기술)

  • 기현종;신정호;백준기
    • Proceedings of the IEEK Conference
    • /
    • 2003.07e
    • /
    • pp.1903-1906
    • /
    • 2003
  • This paper proposes an adaptive regularized noise-smoothing algorithm for range images using the area decreasing flow method, which can preserve meaningful edges during the smoothing process. Although the area decreasing flow method easily smooths Gaussian noise, it has two problems: ⅰ) it is not easy to remove impulsive noise from observed range data, and ⅱ) it is also difficult to remove noise near edges when adaptive regularization is used. In this paper, therefore, a second smoothness constraint is additionally incorporated into the existing regularization algorithm, minimizing the difference between the median-filtered data and the estimated data. As a result, the proposed algorithm can effectively remove the noise of dense range data while preserving edges.

  • PDF
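The median-fidelity idea in the last abstract can be sketched in one dimension: add a term pulling the estimate toward the median-filtered data, which is robust to the impulsive noise that plain quadratic smoothing cannot handle. The energy weights, step size, and 1-D simplification are illustrative assumptions, not the paper's 2-D formulation:

```python
def smooth_range(data, lam=1.0, mu=0.5, iters=100, step=0.1):
    """1-D sketch of regularized smoothing with a median-fidelity term,
    in the spirit of the second smoothness constraint described above:
    minimize  sum (u-z)^2 + lam*sum (u[i+1]-u[i])^2 + mu*sum (u-med)^2
    where med is the 3-point median filter of the data z."""
    n = len(data)
    med = []
    for i in range(n):
        w = sorted(data[max(0, i - 1): i + 2])
        med.append(w[len(w) // 2])
    u = list(data)
    for _ in range(iters):  # plain gradient descent on the energy
        g = []
        for i in range(n):
            gi = 2 * (u[i] - data[i]) + 2 * mu * (u[i] - med[i])
            if i > 0:
                gi += 2 * lam * (u[i] - u[i - 1])
            if i < n - 1:
                gi += 2 * lam * (u[i] - u[i + 1])
            g.append(gi)
        u = [u[i] - step * g[i] for i in range(n)]
    return u
```

On data with a single spike, the median term suppresses the impulse that the quadratic data-fidelity term alone would try to preserve.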