• Title/Summary/Keyword: Sampled-data

Search results: 1,454

Introducing tlc_s05: A Code to Fit Cepheid JHK Band Light Curves Using a Template Approach

  • Ngeow, Chow-Choong;Kanbur, Shashi M.
    • Publications of The Korean Astronomical Society / v.30 no.2 / pp.225-227 / 2015
  • We introduce a code, tlc_s05, that fits sparsely sampled JHK band Cepheid light curve data with template light curves to derive the mean magnitude. A brief description of the code is provided here. We tested the performance of the code in deriving mean JHK band magnitudes using simulations and found that observing more than four evenly spaced data points per light curve allows tlc_s05 to derive accurate mean magnitudes for Cepheid JHK band light curves.
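
A minimal sketch of the general template-fitting idea (not the actual tlc_s05 code): a fixed-shape template light curve is scaled and offset to match a handful of phase-folded observations by linear least squares, and the offset is read off as the mean magnitude. The template shape, period folding, and data below are hypothetical placeholders.

```python
# Illustrative template fit: mag_obs ~ mean + amp * T(phase), solved by least squares.
import numpy as np

def fit_template(phase_obs, mag_obs, template_phase, template_mag):
    """Return (mean magnitude, amplitude) of the best-fitting scaled template."""
    # Interpolate the normalized template at the observed phases.
    t = np.interp(phase_obs, template_phase, template_mag)
    A = np.column_stack([np.ones_like(t), t])          # columns: [mean, amp]
    (mean_mag, amp), *_ = np.linalg.lstsq(A, mag_obs, rcond=None)
    return mean_mag, amp

# Hypothetical normalized template (stand-in for a real Cepheid template).
template_phase = np.linspace(0.0, 1.0, 100)
template_mag = np.cos(2 * np.pi * template_phase)

# Five evenly spaced phase-folded observations with small noise.
rng = np.random.default_rng(0)
phase_obs = np.linspace(0.1, 0.9, 5)
mag_obs = 12.3 + 0.2 * np.interp(phase_obs, template_phase, template_mag) \
          + rng.normal(0, 0.01, 5)

mean_mag, amp = fit_template(phase_obs, mag_obs, template_phase, template_mag)
print(f"fitted mean magnitude: {mean_mag:.3f}, amplitude: {amp:.3f}")
```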

Generating and Validating Synthetic Training Data for Predicting Bankruptcy of Individual Businesses

  • Hong, Dong-Suk;Baik, Cheol
    • Journal of Information and Communication Convergence Engineering / v.19 no.4 / pp.228-233 / 2021
  • In this study, we analyze the credit information (loans, delinquencies, etc.) of individual business owners and generate voluminous training data for a bankruptcy prediction model through a partial synthetic training technique. We then evaluate the prediction performance obtained with the newly generated data against that obtained with the actual data. In the experiments (a logistic regression task), training data generated with conditional tabular generative adversarial networks (CTGAN) improve recall by a factor of 1.75 compared with the actual data. The probability that the actual and generated data are sampled from an identical distribution is verified to be much higher than 80%. Providing artificial intelligence training data through data synthesis for credit rating and default risk prediction of individual businesses, fields in which research has been relatively inactive, should promote further in-depth work on such methods.
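
A minimal sketch of this kind of pipeline (not the authors' implementation): fit a CTGAN model on tabular credit-like records, sample synthetic rows, then train a logistic-regression bankruptcy classifier on the real and on the synthetic training sets and compare recall on held-out data. It assumes the open-source `ctgan` and `scikit-learn` packages; the feature names and toy data are hypothetical placeholders.

```python
import numpy as np
import pandas as pd
from ctgan import CTGAN
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

# Toy stand-in for real credit records: loan amount, delinquency count, bankrupt flag.
rng = np.random.default_rng(42)
real = pd.DataFrame({
    "loan_amount": rng.lognormal(10, 1, 2000),
    "delinquency_count": rng.poisson(1.5, 2000),
})
real["bankrupt"] = (real["delinquency_count"] + rng.normal(0, 1, 2000) > 3).astype(int)

train, test = train_test_split(real, test_size=0.3, stratify=real["bankrupt"], random_state=0)

# Fit CTGAN on the training partition and sample the same number of synthetic rows.
synth_model = CTGAN(epochs=10)
synth_model.fit(train, discrete_columns=["bankrupt"])
synthetic = synth_model.sample(len(train))

def recall_of(train_df):
    """Train logistic regression on train_df and report recall on the held-out test set."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(train_df[["loan_amount", "delinquency_count"]], train_df["bankrupt"])
    pred = clf.predict(test[["loan_amount", "delinquency_count"]])
    return recall_score(test["bankrupt"], pred)

print("recall (real train):     ", recall_of(train))
print("recall (synthetic train):", recall_of(synthetic))
```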

Objective Estimation of Velocity Streamfunction Field with Discretely Sampled Oceanic Data II: with Application of Least-square Regression Analysis (객관적 분석을 통한 속도 유선함수(streamfunction) 산출 II: 최소자승 회귀분석법의 응용)

  • 조광우
    • Journal of Environmental Science International / v.6 no.5 / pp.541-550 / 1997
  • A least-square regression analysis is applied to estimate the velocity streamfunction field from discretely sampled current meter data. The coefficients of a streamfunction expanded in terms of trigonometric basis functions are obtained by enforcing the horizontal non-divergence of the two-dimensional flow field. This method avoids interpolation and gives a root-mean-square (rms) residual of fit which includes the divergent part and the noisiness of the oceanic data. The method is implemented in a boundary-fitted, curvilinear orthogonal coordinate system, which facilitates the specification of boundary conditions. An application is successfully made to the Texas-Louisiana shelf using 32 months of current meter data (31 moorings) observed as part of the Texas-Louisiana Shelf and Transport Processes Study (LATEX). The rms residual of the fit is relatively small for the shelf, which indicates the field is well represented by the streamfunction.

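A minimal sketch of the fitting idea (not the paper's implementation): expand the streamfunction in a small trigonometric basis and fit its coefficients to scattered current-meter velocities by linear least squares, using u = -dψ/dy and v = dψ/dx so the fitted field is non-divergent by construction. The domain size, basis truncation, and sample data are hypothetical.

```python
import numpy as np

Lx = Ly = 1.0          # normalized domain
M = N = 3              # trigonometric basis truncation

def design_matrix(x, y):
    """Rows: [u at all stations, then v at all stations]; one column per basis coefficient."""
    cols_u, cols_v = [], []
    for m in range(1, M + 1):
        for n in range(1, N + 1):
            kx, ky = m * np.pi / Lx, n * np.pi / Ly
            psi_x = kx * np.cos(kx * x) * np.sin(ky * y)   # d(psi)/dx
            psi_y = ky * np.sin(kx * x) * np.cos(ky * y)   # d(psi)/dy
            cols_u.append(-psi_y)                          # u = -d(psi)/dy
            cols_v.append(psi_x)                           # v =  d(psi)/dx
    return np.vstack([np.column_stack(cols_u), np.column_stack(cols_v)])

# Scattered "current meter" positions and a synthetic non-divergent velocity field.
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 1, 31), rng.uniform(0, 1, 31)
u_obs = -np.pi * np.sin(np.pi * x) * np.cos(np.pi * y)
v_obs = np.pi * np.cos(np.pi * x) * np.sin(np.pi * y)

A = design_matrix(x, y)
b = np.concatenate([u_obs, v_obs])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
rms = np.sqrt(np.mean((A @ coef - b) ** 2))
print(f"rms residual of fit: {rms:.3e}")
```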

Real-time Localization of An UGV based on Uniform Arc Length Sampling of A 360 Degree Range Sensor (전방향 거리 센서의 균일 원호길이 샘플링을 이용한 무인 이동차량의 실시간 위치 추정)

  • Park, Soon-Yong;Choi, Sung-In
    • Journal of the Institute of Electronics Engineers of Korea CI / v.48 no.6 / pp.114-122 / 2011
  • We propose an automatic localization technique based on Uniform Arc Length Sampling (UALS) of 360 degree range sensor data. The proposed method samples 3D points from a dense point cloud acquired by the sensor, registers the sampled points to a digital surface model (DSM) in real time, and determines the location of an Unmanned Ground Vehicle (UGV). To reduce the sampling and registration time for a sequence of dense range data, 3D range points are sampled uniformly in terms of ground sample distance. Using the proposed method, we can reduce the number of 3D points while maintaining their uniformity over the range data. We compare the registration speed and accuracy of the proposed method with those of a conventional sampling method. Through several experiments in which the number of sampling points is varied, we analyze the speed and accuracy of the proposed method.
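
A minimal sketch of uniform arc-length sampling (not the authors' code): from a dense 360 degree range scan, keep points spaced by a fixed ground sample distance along the scan contour instead of by a fixed angular step. The synthetic scan and the ground sample distance below are placeholders.

```python
import numpy as np

def uniform_arc_length_sample(points, gsd):
    """Keep a point whenever the accumulated contour length since the last kept point exceeds gsd."""
    kept = [0]
    acc = 0.0
    for i in range(1, len(points)):
        acc += np.linalg.norm(points[i] - points[i - 1])
        if acc >= gsd:
            kept.append(i)
            acc = 0.0
    return points[kept]

# Synthetic dense scan: ranges vary with angle, converted to sensor-centered XY.
angles = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
ranges = 10.0 + 2.0 * np.sin(3 * angles)
scan = np.column_stack([ranges * np.cos(angles), ranges * np.sin(angles)])

sampled = uniform_arc_length_sample(scan, gsd=0.5)   # 0.5 m ground sample distance
print(f"dense points: {len(scan)}, sampled points: {len(sampled)}")
```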

Wavelet Based Matching Pursuit Method for Interpolation of Seismic Trace with Spatial Aliasing (공간적인 알리아싱을 포함한 탄성파 트레이스의 내삽을 위한 요소파 기반의 Matching Pursuit 기법)

  • Choi, Jihun;Byun, Joongmoo;Seol, Soon Jee
    • Geophysics and Geophysical Exploration / v.17 no.2 / pp.88-94 / 2014
  • Due to mechanical failure or limited geographical accessibility, seismic data can be partially missing. In addition, data can be coarsely sampled, as in the crossline direction of marine streamer data. Such irregularly sampled and spatially aliased seismic data may cause problems during seismic data processing. An accurate and efficient interpolation method can solve this problem. Furthermore, interpolation can save acquisition cost and time by reducing the number of shots and receivers. Among various interpolation methods, the Matching Pursuit method can be applied to any sampling scheme, regular or irregular. However, when sinusoidal basis functions are used, the method has a limitation in the presence of spatial aliasing. Therefore, in this study, we developed a wavelet-based Matching Pursuit method that uses wavelets instead of sinusoidal functions to improve dealiasing performance. In addition, we improved interpolation speed by using inner products instead of the L2 norm.
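
A minimal matching-pursuit sketch (not the paper's wavelet-based algorithm): greedily decompose an irregularly sampled trace over a dictionary of Ricker wavelet atoms using inner products, then evaluate the picked atoms on a regular grid to interpolate. The wavelet parameters and test trace are hypothetical.

```python
import numpy as np

def ricker(t, t0, f):
    """Ricker wavelet centered at t0 with peak frequency f."""
    a = (np.pi * f * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

t_grid = np.linspace(0, 1, 200)                       # regular output grid
atom_params = [(t0, f) for t0 in np.linspace(0, 1, 50) for f in (10.0, 20.0, 30.0)]

# Irregular observation times and a test trace made of two Ricker wavelets.
rng = np.random.default_rng(2)
t_obs = np.sort(rng.uniform(0, 1, 60))
trace_obs = ricker(t_obs, 0.3, 20.0) + 0.5 * ricker(t_obs, 0.7, 30.0)

residual = trace_obs.copy()
coeffs = []
for _ in range(10):                                   # a few greedy iterations
    best = None
    for (t0, f) in atom_params:
        atom = ricker(t_obs, t0, f)
        norm2 = atom @ atom
        if norm2 == 0:
            continue
        c = (residual @ atom) / norm2                  # projection coefficient
        score = abs(c) * np.sqrt(norm2)                # residual energy removed
        if best is None or score > best[0]:
            best = (score, c, t0, f)
    _, c, t0, f = best
    residual -= c * ricker(t_obs, t0, f)
    coeffs.append((c, t0, f))

# Interpolated trace on the regular grid from the selected atoms.
trace_interp = sum(c * ricker(t_grid, t0, f) for (c, t0, f) in coeffs)
print(f"final residual rms: {np.sqrt(np.mean(residual ** 2)):.3e}")
```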

A simple and efficient data loss recovery technique for SHM applications

  • Thadikemalla, Venkata Sainath Gupta;Gandhi, Abhay S.
    • Smart Structures and Systems / v.20 no.1 / pp.35-42 / 2017
  • Recently, compressive sensing based data loss recovery techniques have become popular for Structural Health Monitoring (SHM) applications. These techniques involve an encoding process that is onerous for the sensor node because of the random sensing matrices used in compressive sensing. In this paper, we present a model in which the sampled raw acceleration data are transmitted directly to the base station/receiver without any encoding at the transmitter. The incomplete acceleration data received after data losses can be reconstructed faithfully using compressive sensing based reconstruction techniques. An in-depth simulated analysis is presented of how random losses and continuous losses affect the reconstruction of acceleration signals (obtained from a real bridge). Along with a performance analysis for different simulated data losses (from 10 to 50%), the advantages of performing interleaving before transmission are also presented.
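
A minimal sketch of compressive-sensing style loss recovery (not the paper's exact method): treat the acceleration record as sparse in a cosine basis, keep only the samples that survived transmission, and recover the missing ones with orthogonal matching pursuit. The signal, loss rate, and sparsity level are hypothetical placeholders.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

n = 512
t = np.arange(n)
k = np.arange(n)
# Cosine (DCT-like) basis: signal = Psi @ coefficients, with sparse coefficients.
Psi = np.cos(np.pi * np.outer(t + 0.5, k) / n)

# Toy "acceleration" record built from a few basis vectors plus noise.
rng = np.random.default_rng(3)
coef_true = np.zeros(n)
coef_true[[12, 37, 80]] = [1.0, 0.6, 0.3]
signal = Psi @ coef_true + 0.01 * rng.normal(size=n)

# Simulate 30% random data loss during wireless transmission.
kept = rng.random(n) > 0.30
received = signal[kept]

# Recover sparse coefficients from the received samples only.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=10, fit_intercept=False)
omp.fit(Psi[kept], received)
reconstructed = Psi @ omp.coef_

err = np.linalg.norm(reconstructed - signal) / np.linalg.norm(signal)
print(f"relative reconstruction error: {err:.3%}")
```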

The Effects of Uncertain Topographic Data on Spatial Prediction of Landslide Hazard

  • Park, No-Wook;Kyriakidis, Phaedon C.
    • Proceedings of the KSRS Conference / 2008.10a / pp.259-261 / 2008
  • GIS-based spatial data integration tasks have used exhaustive thematic maps generated from sparsely sampled data or satellite-based exhaustive data. Due to the simplification of reality and errors in mapping procedures, such spatial data are usually imperfect and of varying accuracy. The objective of this study is to carry out a sensitivity analysis of the input topographic data used for landslide hazard mapping. Two different types of elevation estimates, elevation spot heights and a DEM from ASTER stereo images, are considered. The geostatistical framework of kriging is applied to generate more reliable elevation estimates from both the sparse elevation spot heights and the exhaustive ASTER-based elevation values. The effects of the differing accuracy of the resulting terrain-related maps on the prediction performance of landslide hazard are illustrated through a case study of Boeun, Korea.

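A minimal ordinary-kriging sketch (not the study's geostatistical workflow): interpolate sparse elevation spot heights to a grid location with a simple exponential variogram. The variogram parameters and sample points are hypothetical placeholders, not fitted to real data.

```python
import numpy as np

def variogram(h, sill=100.0, rng_param=500.0, nugget=0.0):
    """Exponential variogram model gamma(h)."""
    return nugget + sill * (1.0 - np.exp(-h / rng_param))

def ordinary_kriging(xy_obs, z_obs, xy_new):
    """Ordinary kriging estimate of z at xy_new from scattered observations."""
    n = len(z_obs)
    d = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    # Kriging system: variograms between observations plus the Lagrange row/column.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy_obs - xy_new, axis=-1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z_obs          # kriged elevation estimate

# Sparse "spot heights" (x, y in metres; z in metres) and one target grid location.
rng = np.random.default_rng(4)
xy_obs = rng.uniform(0, 1000, (30, 2))
z_obs = 200 + 0.05 * xy_obs[:, 0] + 5 * rng.normal(size=30)
z_hat = ordinary_kriging(xy_obs, z_obs, np.array([500.0, 500.0]))
print(f"kriged elevation at (500, 500): {z_hat:.1f} m")
```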

Object-Oriented Field Information Management Program Developed for Precision Agriculture

  • Sung J. H.;Choi K. M.
    • Agricultural and Biosystems Engineering / v.4 no.2 / pp.50-57 / 2003
  • This study was conducted to develop software that provides automatic site-specific field data acquisition, data processing, data mapping, and management for precision agriculture. The developed software supports acquisition and processing of both digital and analog data streams. The architecture is object-oriented, and each component in the architecture was developed as a separate class. In precision agriculture research, the laborious task of manual ground-truth data collection can be avoided using the developed software. In addition, gathering high-density data eliminates the need to interpolate values for unsampled areas. The software shows good potential for expansion and compatibility with variable-rate application (VRA). The FIM (Field Information Management) program provides the user with an easy-to-follow process for field information management in precision agriculture.

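A minimal sketch of the object-oriented layering described above, purely illustrative: the class and method names are hypothetical and are not taken from the FIM program; they only show acquisition, processing, and mapping split into separate classes.

```python
from dataclasses import dataclass

@dataclass
class SamplePoint:
    lat: float
    lon: float
    value: float          # e.g., a GPS-tagged soil-sensor reading

class Acquisition:
    """Collects site-specific samples from digital or analog data streams."""
    def read(self):
        # Stand-in for live sensor input.
        return [SamplePoint(37.50, 127.00, 42.0), SamplePoint(37.51, 127.01, 38.5)]

class Processing:
    """Cleans and aggregates raw samples before mapping."""
    def average(self, points):
        return sum(p.value for p in points) / len(points)

class Mapping:
    """Turns processed values into a simple field summary (map-layer stand-in)."""
    def summarize(self, points, mean_value):
        return f"{len(points)} samples, field mean = {mean_value:.1f}"

points = Acquisition().read()
print(Mapping().summarize(points, Processing().average(points)))
```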

Estimation of track irregularity using NARX neural network (NARX 신경망을 이용한 철도 궤도틀림 추정)

  • Kim, Man-Cheol;Choi, Bai-Sung;Kim, Yu-Hee;Shin, Soob-Ong
    • Proceedings of the KSR Conference / 2011.10a / pp.275-280 / 2011
  • Due to the high speed of trains, track deformation increases rapidly and may lead to track irregularities that threaten track stability. To secure track stability, continual inspection of track irregularities is required. This paper presents a methodology for identifying track irregularity using a NARX neural network, which accounts for the non-linearity of the train structural system. A simulation study was carried out to examine the proposed method. Acceleration time-history data measured at a bogie were re-sampled at 0.25 m intervals along the track to match the track irregularity data. In the simulation study, two sets of measured data were simulated; the second set was obtained from a train with 10% more mass than the one used for the first set. The first set of simulated data was used to train the series-parallel mode of the NARX neural network. The track irregularities in the second time period were then identified using the measured acceleration data. The closeness of the identified track irregularity to the actual one is evaluated by PSD and RMSE.

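A minimal series-parallel NARX sketch (not the paper's network): predict the track irregularity at step k from lagged bogie accelerations (the exogenous input) and lagged past irregularity values, using a small MLP as the nonlinear map. The lag orders and synthetic signals are hypothetical placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def narx_features(u, y, nu=4, ny=4):
    """Rows: [u[k-nu..k-1], y[k-ny..k-1]] -> target y[k] (series-parallel training)."""
    X, t = [], []
    for k in range(max(nu, ny), len(y)):
        X.append(np.concatenate([u[k - nu:k], y[k - ny:k]]))
        t.append(y[k])
    return np.array(X), np.array(t)

# Synthetic stand-ins: track irregularity y and bogie acceleration u, every 0.25 m of track.
rng = np.random.default_rng(5)
s = np.arange(4000) * 0.25
y = 0.003 * np.sin(2 * np.pi * s / 25.0) + 0.0005 * rng.normal(size=len(s))
u = np.gradient(np.gradient(y)) * 1e4 + 0.01 * rng.normal(size=len(s))

X, t = narx_features(u, y)
split = len(X) // 2
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], t[:split])

pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - t[split:]) ** 2))
print(f"RMSE on the second half of the track: {rmse:.5f} m")
```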

Big Data Analysis and Prediction of Traffic in Los Angeles

  • Dauletbak, Dalyapraz;Woo, Jongwook
    • KSII Transactions on Internet and Information Systems (TIIS) / v.14 no.2 / pp.841-854 / 2020
  • The paper explains a method to process, analyze, and predict traffic patterns in Los Angeles County using Big Data and Machine Learning. The dataset comes from a popular navigation platform in the USA, which tracks road information using connected users' devices and also collects reports shared by users through the app. The dataset mainly consists of information about traffic jams and traffic incidents reported by users, such as road closures, hazards, and accidents. The major contribution of this paper is a clear view of how large-scale road traffic data can be stored and processed using the Big Data system Hadoop and its ecosystem (Hive). In addition, the analysis is explained with the help of visuals using Business Intelligence, and prediction with a classification machine learning model on the sampled traffic data is presented using Azure ML. The modeling process and results are interpreted using the metrics accuracy, precision, and recall.
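
A minimal sketch of the evaluation step described above (not the authors' Azure ML pipeline): train a classifier on a sampled tabular traffic set and report accuracy, precision, and recall. The feature names, labeling rule, and toy data are hypothetical placeholders for the jam/incident attributes in the real dataset.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(6)
n = 5000
traffic = pd.DataFrame({
    "hour": rng.integers(0, 24, n),
    "reports_per_km": rng.poisson(2, n),
    "avg_speed_kmh": rng.normal(50, 15, n).clip(5, 120),
})
# Label: 1 = severe jam (placeholder rule, for illustration only).
traffic["severe_jam"] = ((traffic["avg_speed_kmh"] < 35) &
                         (traffic["reports_per_km"] > 1)).astype(int)

X = traffic[["hour", "reports_per_km", "avg_speed_kmh"]]
y = traffic["severe_jam"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"accuracy:  {accuracy_score(y_te, pred):.3f}")
print(f"precision: {precision_score(y_te, pred):.3f}")
print(f"recall:    {recall_score(y_te, pred):.3f}")
```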