• Title/Abstract/Keywords: Data Density

레이더검지기의 차량궤적 정보기반의 고속도로 밀도산출방법에 관한 비교 (Comparison of Estimation Methods for the Density on Expressways Using Vehicular Trajectory Data from a Radar Detector)

  • 김상구;한음;이환필;김해;윤일수
    • 한국도로학회논문집
    • /
    • Vol. 18, No. 5
    • /
    • pp.117-125
    • /
    • 2016
  • PURPOSES: The density of uninterrupted traffic flow facilities plays an important role in representing the current status of traffic flow; for example, density serves as the primary measure of effectiveness in capacity analysis for freeway facilities. Estimating density has therefore long been a difficult task for traffic engineers. This study was initiated to evaluate the performance of density values estimated from VDS data using two traditional methods, one based on traffic flow theory and the other on occupancy, by comparing them against density values estimated from vehicular trajectory data generated by a radar detector. METHODS: A radar detector capable of generating very accurate vehicular trajectories within a range of 250 m was installed on the Joongbu Expressway near the Dongseoul tollgate, where two VDS were already in place. The first task was to estimate densities using the different data sources and methods: density values were estimated with the two traditional methods from the VDS data on the Joongbu Expressway and compared with those estimated from the vehicular trajectory data in order to evaluate the quality of the density estimation. Then the relationships between space mean speed and density were drawn using two sets of densities and speeds based on the VDS data and one set based on the radar detector data. CONCLUSIONS: The three sets of density values showed minor differences when density was below 20 vehicles per km per lane. As density exceeded 20 vehicles per km per lane, however, the three methods diverged significantly from one another, with the density based on vehicular trajectory data generally showing the lowest values. In-depth analysis found that space mean speed plays a critical role in the calculation of density: the speed estimated from the VDS data was higher than that from the radar detector. To validate this difference in the speed data, the traffic flow models relating space mean speed and density were carefully examined, and the traffic flow model generated using the radar data appears to be the more realistic.
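
The two traditional estimates the abstract refers to follow directly from the fundamental traffic flow relation q = k·u_s and from detector occupancy. The sketch below is illustrative only (it is not the study's code); the vehicle length, detector zone length, and segment length are assumed values.

```python
# Minimal sketches of the three density estimates compared in the study.
# Vehicle length (4.5 m) and detector zone length (2.0 m) are assumptions.

def density_from_flow_speed(flow_vph, space_mean_speed_kph):
    """k = q / u_s: density (veh/km/lane) from flow and space mean speed."""
    return flow_vph / space_mean_speed_kph

def density_from_occupancy(occupancy_pct, veh_len_m=4.5, det_len_m=2.0):
    """k = occ / (L_v + L_d): density from detector time occupancy (percent)."""
    return (occupancy_pct / 100.0) * 1000.0 / (veh_len_m + det_len_m)

def density_from_trajectories(positions_m, segment_len_m=250.0, lanes=1):
    """Instantaneous density from trajectory data: vehicles inside the
    segment at one time stamp, normalized to veh/km/lane."""
    n = sum(1 for x in positions_m if 0.0 <= x <= segment_len_m)
    return n / (segment_len_m / 1000.0) / lanes

# 1800 veh/h at 90 km/h gives k = 20 veh/km/lane, the threshold above
# which the study reports the three methods diverging.
print(density_from_flow_speed(1800, 90))   # 20.0
print(density_from_occupancy(13.0))        # 20.0
```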

반복적 비선형역산에 의한 2차원 지질구조의 중력자료 해석 연구 (A Study on Interpretation of Gravity Data on Two-Dimensional Geologic Structures by Iterative Nonlinear Inverse)

  • 고진석;양승진
    • 자원환경지질
    • /
    • Vol. 27, No. 5
    • /
    • pp.479-489
    • /
    • 1994
  • In this paper, an iterative least-squares inversion method is used to determine the shapes and density contrasts of 2-D structures from gravity data. The 2-D structures are represented by cross-sections of N-sided polygons with density contrasts that are either constant or varying with depth. Gravity data are calculated from theoretical formulas for these structure models, treated as observed data, and used for the inversions. The inversions proceed as follows: 1) the polygon's vertices and density contrast are initially assumed; 2) gravity values are calculated for the assumed model and the error between the true (observed) and calculated gravity is determined; 3) new vertices and a new density contrast are determined from the error using the damped least-squares inversion method; and 4) the final model is accepted when the error becomes very small. The results of this study show that the shape and density contrast of each model are accurately determined when the density contrast is constant or the vertical density gradient is known. Where the density gradient is unknown, the inversion gives incorrect results; however, the shape and density gradient of the model can still be determined when the surface density contrast is known.
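
The damped least-squares step the abstract describes can be written compactly. The sketch below is a generic Marquardt-style update, not the authors' program: `forward` stands in for the theoretical 2-D polygon gravity formula, and the parameter vector `m` (vertex coordinates plus density contrast) is a placeholder.

```python
import numpy as np

def jacobian(forward, m, eps=1e-6):
    """Numerical Jacobian of the forward gravity model."""
    g0 = forward(m)
    J = np.zeros((g0.size, m.size))
    for j in range(m.size):
        dm = np.zeros_like(m)
        dm[j] = eps
        J[:, j] = (forward(m + dm) - g0) / eps
    return J

def damped_lsq_invert(forward, g_obs, m0, lam=1e-2, n_iter=50, tol=1e-8):
    """Iterate m <- m + (J^T J + lam*I)^(-1) J^T r until the error is small."""
    m = m0.astype(float).copy()
    for _ in range(n_iter):
        r = g_obs - forward(m)   # residual between observed and calculated gravity
        if r @ r < tol:          # step 4: stop when the error is very small
            break
        J = jacobian(forward, m)
        dm = np.linalg.solve(J.T @ J + lam * np.eye(m.size), J.T @ r)
        m += dm                  # step 3: update vertices and density contrast
    return m
```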

남한지역 검층밀도 자료의 특성 분석 (Frequency Distribution Characteristics of Formation Density Derived from Log and Core Data throughout the Southern Korean Peninsula)

  • 김영화;김기환;김종만;황세호
    • 지질공학
    • /
    • Vol. 25, No. 2
    • /
    • pp.281-290
    • /
    • 2015
  • Density log data acquired throughout the southern Korean Peninsula were collected and compared with core density data. The comparison between core density and log density first revealed abnormally low log density values, which were found to be associated with the abnormally low densities in the weak-source density log data. All of the results, including the large density difference between standard-source and weak-source data and the shape, mean, and standard deviation of the distribution curves observed when correlating standard-source log density with core density, indicated a quality problem in the weak-source density data. This quality problem was attributed to the source characteristics of the weak-source density logging tool in determining formation density; it was concluded that the weak-source density data obtained to date were acquired without satisfying the minimum conditions required to maintain accuracy. Finally, the density distribution characteristics of the major formations in South Korea were determined using the core data and the standard-source data.
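
As a rough illustration of the kind of comparison reported (distribution mean, standard deviation, and the correlation between log and core density), the sketch below uses made-up placeholder values, not the paper's data.

```python
import numpy as np

core     = np.array([2.61, 2.65, 2.58, 2.70, 2.63])  # hypothetical core densities (g/cm^3)
log_std  = np.array([2.60, 2.66, 2.57, 2.69, 2.62])  # standard-source log densities
log_weak = np.array([2.38, 2.41, 2.35, 2.44, 2.39])  # weak-source log densities (abnormally low)

for name, d in [("core", core), ("standard source", log_std), ("weak source", log_weak)]:
    print(f"{name}: mean = {d.mean():.3f}, std = {d.std(ddof=1):.3f}")

# Correlation of standard-source log density against core density
print("r(core, standard) =", float(np.corrcoef(core, log_std)[0, 1]))
```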

A Modified Approach to Density-Induced Support Vector Data Description

  • Park, Joo-Young;Kang, Dae-Sung
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • Vol. 7, No. 1
    • /
    • pp.1-6
    • /
    • 2007
  • The SVDD (support vector data description) is one of the best-known one-class support vector learning methods, in which balls defined in the feature space are used to distinguish a set of normal data from all other possible abnormal objects. Recently, with the objective of generalizing the SVDD, which treats all training data with equal importance, the so-called D-SVDD (density-induced support vector data description) was proposed, incorporating the idea that data in a higher-density region are more significant than those in a lower-density region. In this paper, we consider the problem of further improving the D-SVDD toward the use of a partial reference set for testing, and propose an LMI (linear matrix inequality)-based optimization approach to solve the improved version of the D-SVDD problems. Our approach utilizes a new class of density-induced distance measures based on the RSDE (reduced set density estimator), along with an LMI-based mathematical formulation in the form of SDP (semi-definite programming) problems, which can be solved efficiently by interior-point methods. The validity of the proposed approach is illustrated via numerical experiments using real data sets.
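
The paper's LMI/SDP formulation is not reproduced here, but the underlying density-induced idea can be approximated in a few lines: weight each training point by an estimated density so that points in high-density regions influence the description more. The sketch below is an assumption-laden stand-in using scikit-learn's one-class SVM (which, with an RBF kernel, is equivalent to the ball-based SVDD); all parameter values are illustrative.

```python
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))               # "normal" training data

# Density-induced weights: higher weight for points in denser regions
log_dens = KernelDensity(bandwidth=0.5).fit(X).score_samples(X)
weights = np.exp(log_dens)
weights /= weights.mean()

clf = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1)
clf.fit(X, sample_weight=weights)           # weighted one-class description

print(clf.predict([[0.0, 0.0], [4.0, 4.0]]))  # +1 = accepted as normal, -1 = outlier
```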

Piecewise Continuous Linear Density Estimator

  • Jang, Dae-Heung
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 16, No. 4
    • /
    • pp.959-968
    • /
    • 2005
  • The piecewise linear histogram can be used as a simple and efficient tool for density estimation, but it is a discontinuous function. We propose the piecewise continuous linear histogram as a simple and efficient density estimator and as an alternative to the piecewise linear histogram.
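
One common construction of such an estimator is the frequency polygon: connect the midpoints of a density-normalized histogram with straight lines, which removes the jumps of the piecewise-constant histogram. The sketch below is a generic version under that reading of the abstract, not necessarily the authors' exact estimator.

```python
import numpy as np

def frequency_polygon(data, bins=10):
    """Piecewise continuous linear density estimate built on a histogram."""
    heights, edges = np.histogram(data, bins=bins, density=True)
    mids = 0.5 * (edges[:-1] + edges[1:])
    w = edges[1] - edges[0]
    # Zero-height anchor points half a bin outside the data range keep the
    # estimator continuous and integrating (approximately) to one.
    xs = np.concatenate(([mids[0] - w], mids, [mids[-1] + w]))
    ys = np.concatenate(([0.0], heights, [0.0]))
    return lambda x: np.interp(x, xs, ys, left=0.0, right=0.0)

f_hat = frequency_polygon(np.random.default_rng(1).normal(size=500))
print(f_hat(0.0))   # density near the mode of N(0, 1), roughly 0.4
```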

A Note on Support Vector Density Estimation with Wavelets

  • Lee, Sung-Ho
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 16, No. 2
    • /
    • pp.411-418
    • /
    • 2005
  • We review support vector and wavelet density estimation. The relationship between support vector and wavelet density estimation in a reproducing kernel Hilbert space (RKHS) is investigated in order to allow wavelets to serve as a family of support vector kernels in support vector density estimation.

Reducing Bias of the Minimum Hellinger Distance Estimator of a Location Parameter

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 17, No. 1
    • /
    • pp.213-220
    • /
    • 2006
  • Since Beran (1977) developed minimum Hellinger distance estimation, this method has been a popular topic in the field of robust estimation. In the process of defining the distance, a kernel density estimator is widely used as the density estimator. In this article, however, we show that combining a kernel density estimator with an empirical density can result in a smaller bias of the minimum Hellinger distance estimator of a location parameter than using a kernel density estimator alone.
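
For a location parameter, the estimator minimizes the Hellinger distance between a nonparametric density estimate and the model density. The sketch below uses a plain kernel density estimate (the paper's bias reduction additionally mixes in an empirical density); the normal location model and all numeric settings are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(2)
x = rng.normal(loc=1.5, size=200)           # sample with true location 1.5

grid = np.linspace(x.min() - 3.0, x.max() + 3.0, 400)
step = grid[1] - grid[0]
fh = gaussian_kde(x)(grid)                  # kernel density estimate on the grid

def hellinger_sq(theta):
    """HD^2(f_hat, f_theta) = 2 - 2 * integral of sqrt(f_hat * f_theta)."""
    ft = norm.pdf(grid, loc=theta, scale=1.0)
    return 2.0 - 2.0 * np.sum(np.sqrt(fh * ft)) * step

theta_hat = minimize_scalar(hellinger_sq, bounds=(-5.0, 5.0), method="bounded").x
print(round(theta_hat, 3))                  # close to 1.5
```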

A note on nonparametric density deconvolution by weighted kernel estimators

  • Lee, Sungho
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 25, No. 4
    • /
    • pp.951-959
    • /
    • 2014
  • Recently, Hazelton and Turlach (2009) proposed a weighted kernel density estimator for the deconvolution problem. In the case of Gaussian kernels and Gaussian measurement error, they argued that the weighted kernel density estimator is competitive with the classical deconvolution kernel estimator. In this paper we consider weighted kernel density estimators when the sample observations are contaminated by double-exponentially distributed errors. The performance of the weighted kernel density estimators is compared with that of the classical deconvolution kernel estimator and of a kernel density estimator based on the support vector regression method by means of a simulation study. The weighted density estimator with the Gaussian kernel shows numerical instability in the practical implementation of its optimization function. The weighted density estimates with the double exponential kernel follow patterns very similar to those of the classical kernel density estimates in the simulations, but their shape is less satisfactory than that of the classical kernel density estimator with the Gaussian kernel.
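
For reference, the classical deconvolution kernel estimator that the weighted estimators are benchmarked against takes a closed form when the error is double exponential (Laplace) and the kernel is Gaussian: K*(u) = φ(u)·(1 + (b²/h²)(1 − u²)), with φ the standard normal density and b the Laplace scale. The sketch below implements that baseline under these stated assumptions; it is not the weighted estimator of the paper.

```python
import numpy as np

def deconv_kde(x_grid, w_obs, b, h):
    """Classical deconvolution KDE for W = X + eps, eps ~ Laplace(0, b)."""
    u = (x_grid[:, None] - w_obs[None, :]) / h
    phi = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    k_star = phi * (1.0 + (b / h) ** 2 * (1.0 - u**2))   # deconvoluting kernel
    return k_star.mean(axis=1) / h

rng = np.random.default_rng(3)
x = rng.normal(size=500)                    # latent variable of interest
w = x + rng.laplace(scale=0.4, size=500)    # contaminated observations
grid = np.linspace(-4.0, 4.0, 201)

f_hat = deconv_kde(grid, w, b=0.4, h=0.45)
print(round(f_hat[100], 3))                 # estimate at 0; the N(0,1) density there is ~0.399
```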

Ferroelectric ultra high-density data storage based on scanning nonlinear dielectric microscopy

  • Cho, Ya-Suo;Odagawa, Nozomi;Tanaka, Kenkou;Hiranaga, Yoshiomi
    • 정보저장시스템학회논문집
    • /
    • Vol. 3, No. 2
    • /
    • pp.94-112
    • /
    • 2007
  • Nano-sized inverted domain dots in ferroelectric materials have potential application in ultrahigh-density rewritable data storage systems. Herein, a data storage system is presented based on scanning nonlinear dielectric microscopy and a thin film of single-crystal ferroelectric lithium tantalate. Through domain engineering, we succeeded in forming the smallest artificial nano-domain dot to date, 5.1 nm in diameter, and an artificial nano-domain dot array with a memory density of 10.1 Tbit/inch² at a bit spacing of 8.0 nm, representing the highest memory density for rewritable data storage reported to date. A sub-nanosecond (500 ps) domain switching speed has also been achieved. Next, the long-term retention characteristics of data written as inverted domain dots were investigated by heat-treatment testing; the obtained lifetime of an inverted dot with a radius of 50 nm was 16.9 years at 80 °C. Finally, actual information storage with a low bit-error ratio and high memory density was performed. A bit-error ratio of less than 1×10⁻⁴ was achieved at an areal density of 258 Gbit/inch², and actual information storage was demonstrated at a density of 1 Tbit/inch².
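
The reported figures are mutually consistent: assuming square bit cells, an 8.0 nm bit spacing corresponds to roughly 10 Tbit/inch², as the arithmetic below checks.

```python
# Areal density from bit spacing, assuming a square bit cell.
INCH_NM = 2.54e7                       # 1 inch = 2.54e7 nm
spacing_nm = 8.0
bits_per_inch = INCH_NM / spacing_nm
print(bits_per_inch**2 / 1e12)         # ~10.08 Tbit/inch^2, matching the reported 10.1
```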

Estimation of Lower Jaw Density using CT data

  • Jargalsaikhan, Ariunbold;Sengee, Nyamlkhagva;Telue, Berekjan;Ochirkhvv, Sambuu
    • Journal of Multimedia Information System
    • /
    • Vol. 6, No. 2
    • /
    • pp.67-74
    • /
    • 2019
  • Bone density is one of the factors in the early failure of dental implants, and doctors should make a preoperative assessment of jaw bone density using the patient's CT data before dental implant surgery in order to find out whether the patient has osteoporosis or osteopenia. The main goal of this study was to propose a method, based on image processing techniques, that provides doctors with accurate information about where to drill and place the abutment screw of an implant in the jaw bone, and that reduces the manual effort required to estimate the local cancellous bone density of the mandible from CT data. The experiment was performed on computed tomography data of the jaw bones of two different individuals. We assumed that the result of the jaw bone density estimation depends on the angle of drilling, and average HU (Hounsfield unit) values were used to evaluate the local cancellous bone density of the mandible. As a result of this study, we developed a toolbox that can be used to estimate jaw bone density automatically, and we found a positive correlation between the angle of the drill and time complexity but a negative correlation between the diameter of the drill and time complexity.
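
The measurement at the heart of such a toolbox is averaging HU values along a candidate drill axis through the CT volume. The sketch below is an illustrative reconstruction, not the authors' toolbox; the CT volume, start point, and drill direction are placeholders.

```python
import numpy as np

def mean_hu_along_drill(ct, start, direction, length_vox, n_samples=100):
    """Average Hounsfield units sampled along a drill line through a CT volume."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)                               # unit drill direction
    ts = np.linspace(0.0, length_vox, n_samples)
    pts = np.asarray(start, dtype=float)[None, :] + ts[:, None] * d[None, :]
    idx = np.clip(np.rint(pts).astype(int), 0, np.array(ct.shape) - 1)
    return ct[idx[:, 0], idx[:, 1], idx[:, 2]].mean()    # nearest-voxel sampling

# Placeholder volume standing in for a jaw CT scan (HU values)
ct = np.random.default_rng(4).integers(-100, 1000, size=(64, 64, 64))
print(mean_hu_along_drill(ct, start=(32, 32, 10), direction=(0, 0, 1), length_vox=30))
```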