• Title/Summary/Keyword: data sampling

Novel Compressed Sensing Techniques for Realistic Image (실감 영상을 위한 압축 센싱 기법)

  • Lee, Sun Yui;Jung, Kuk Hyun;Kim, Jin Young;Park, Gooman
    • Journal of Satellite, Information and Communications / v.9 no.3 / pp.59-63 / 2014
  • This paper describes the basic principles of a 3D broadcast system and proposes a new 3D broadcast technology that reduces the amount of data by applying CS (Compressed Sensing). The differences between classical sampling theory and the CS concept are described, along with the recently proposed CS algorithms AMP (Approximate Message Passing) and CoSaMP (Compressive Sampling Matching Pursuit). The paper compares the accuracy of the two algorithms and the computation time needed to compress and restore image data with each, and from these results determines a low-complexity algorithm for the 3D broadcast system.
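The CoSaMP recovery loop named in the abstract is compact enough to sketch. Below is a minimal, illustrative implementation of the standard CoSaMP iteration (signal proxy, support merge, least squares, prune); the matrix sizes, sparsity level, and function name are arbitrary choices for the sketch, not taken from the paper:

```python
import numpy as np

def cosamp(A, y, K, n_iter=30):
    """Minimal CoSaMP: recover a K-sparse x from measurements y = A @ x."""
    n = A.shape[1]
    x = np.zeros(n)
    r = y.astype(float).copy()
    for _ in range(n_iter):
        proxy = A.T @ r                                    # correlate residual with columns
        omega = np.argsort(np.abs(proxy))[-2 * K:]         # 2K strongest candidates
        T = np.union1d(omega, np.flatnonzero(x))           # merge with current support
        b = np.zeros(n)
        b[T] = np.linalg.lstsq(A[:, T], y, rcond=None)[0]  # least squares on merged support
        x = np.zeros(n)
        keep = np.argsort(np.abs(b))[-K:]                  # prune back to K entries
        x[keep] = b[keep]
        r = y - A @ x
        if np.linalg.norm(r) < 1e-10:
            break
    return x
```

With a random Gaussian sensing matrix and enough measurements, the loop typically recovers the sparse vector exactly, which is the low-complexity behavior the paper exploits.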

Monitoring Benthic Algal Communities: A Comparison of Targeted and Coefficient Sampling Methods

  • Edwards, Matthew S.;Tinker, Martin T.
    • ALGAE / v.24 no.2 / pp.111-120 / 2009
  • Choosing an appropriate sample unit is a fundamental decision in the design of ecological studies. While numerous methods have been developed to estimate organism abundance, they differ in cost, accuracy and precision. Using both field data and computer simulation modeling, we evaluated the costs and benefits associated with two methods commonly used to sample benthic organisms in temperate kelp forests. One of these methods, the Targeted Sampling method, relies on different sample units, each "targeted" for a specific species or group of species, while the other method relies on coefficients that represent ranges of bottom cover obtained from visual estimates within standardized sample units. Both the field data and the computer simulations suggest that the two methods yield remarkably similar estimates of organism abundance and among-site variability, although the Coefficient method slightly underestimates variability among sample units when abundances are low. In contrast, the two methods differ considerably in the effort needed to sample these communities; the Targeted Sampling requires more time and twice the personnel to complete. We conclude that the Coefficient Sampling method may be better for environmental monitoring programs where changes in mean abundance are of central concern and resources are limiting, but that the Targeted Sampling method may be better for ecological studies where quantitative relationships among species and small-scale variability in abundance are of central concern.

Quantification of Uncertainty Associated with Soil Sampling and Its Reduction Approaches (토양오염도 평가시 시료채취 불확실성 정량화 및 저감방안)

  • Kim, Geonha
    • Journal of Soil and Groundwater Environment / v.18 no.1 / pp.94-101 / 2013
  • It is well known that the uncertainty associated with soil sampling is larger than that associated with chemical analysis. In this research, sampling uncertainties in assessing TPH and BTEX concentrations in soils were quantified based on actual field data. It is almost impossible to assess the exact contamination of a site, no matter how carefully the sampling plan is devised. Uncertainties associated with sample reduction for subsequent chemical analysis were approximately 10 times larger than those associated with core sampling on site. Larger uncertainties occur when the contamination level is low, the sample quantity is small, and the soil particles are coarse. To minimize these uncertainties in the field, homogenization of the soil sample is necessary, and procedures for it are proposed in this research as well.
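The sampling-versus-analysis split described above can be estimated from a duplicate design: take several field samples and analyse each one in replicate. The sketch below uses the classical one-way ANOVA variance partition with hypothetical data; it illustrates the idea, not the paper's exact procedure:

```python
import numpy as np

def partition_uncertainty(data):
    """One-way ANOVA split of total variance into a sampling component
    (between field samples) and an analytical component (within-sample
    replicates). data has shape (n_samples, n_replicates)."""
    data = np.asarray(data, dtype=float)
    n, r = data.shape
    grand = data.mean()
    # within-sample mean square = analytical variance
    ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (r - 1))
    # between-sample mean square, r replicates per sample mean
    ms_between = r * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    s2_analysis = ms_within
    s2_sampling = max((ms_between - ms_within) / r, 0.0)  # clip at zero
    return s2_sampling, s2_analysis
```

When the between-sample spread dwarfs the replicate spread, the sampling variance dominates, mirroring the paper's finding that sampling, not analysis, drives the overall uncertainty.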

A Study for Time Standard Estimation with Activity Sampling Method (가동샘플링기법에 의한 표준시간추정에 관한 연구)

  • 이근희
    • Journal of Korean Society of Industrial and Systems Engineering / v.6 no.9 / pp.1-5 / 1983
  • This study addresses the application of survey sampling theory to activity sampling and the application of activity sampling to time standard estimation. Cluster, stratified, and multistage sampling are studied in conjunction with random and systematic sampling. Estimation procedures that maximize the information obtained per unit cost of the study are considered, together with a specification of the procedure used to estimate the accuracy of the estimates under the adopted design. The use of multiple regression and linear programming to estimate standard element performance times from typical job-lot production data is also considered.
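A standard companion to the estimation procedures discussed here is the binomial sample-size formula for activity (work) sampling: the number of random observations needed to estimate an activity proportion p within an absolute error e at a given confidence level. This is the textbook formula, offered as context rather than the paper's own derivation:

```python
import math

def work_sampling_n(p, e, z=1.96):
    """Observations needed to estimate an activity proportion p to within
    +/- e (absolute) at the confidence level implied by z (1.96 ~ 95%)."""
    return math.ceil(z * z * p * (1.0 - p) / (e * e))
```

For example, estimating a 50% activity level to within 5 percentage points at 95% confidence requires 385 observations; a 20% activity level under the same precision needs only 246, since p(1-p) shrinks away from 0.5.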

Compensation Methods for Non-uniform and Incomplete Data Sampling in High Resolution PET with Multiple Scintillation Crystal Layers (다중 섬광결정을 이용한 고해상도 PET의 불균일/불완전 데이터 보정기법 연구)

  • Lee, Jae-Sung;Kim, Soo-Mee;Lee, Kwon-Song;Sim, Kwang-Souk;Rhe, June-Tak;Park, Kwang-Suk;Lee, Dong-Soo;Hong, Seong-Jong
    • Nuclear Medicine and Molecular Imaging / v.42 no.1 / pp.52-60 / 2008
  • Purpose: To establish methods for sinogram formation and correction in order to appropriately apply the filtered backprojection (FBP) reconstruction algorithm to data acquired using a PET scanner with multiple scintillation crystal layers. Materials and Methods: The format for raw PET data storage and the conversion from list-mode data to histogram and sinogram were optimized. To solve the various problems that occurred while the raw histogram was converted into a sinogram, an optimal sampling strategy and a sampling efficiency correction method were investigated. Gap compensation methods unique to this system were also investigated. All sinogram data were reconstructed using the 2D filtered backprojection algorithm and compared to estimate the improvements from the correction algorithms. Results: The optimal radial sampling interval and number of angular samples, in terms of the sampling theorem and the sampling efficiency correction algorithm, were pitch/2 and 120, respectively. By applying the sampling efficiency correction and gap compensation, artifacts and background noise in the reconstructed image could be reduced. Conclusion: A conversion method from histogram to sinogram was established for FBP reconstruction of data acquired using multiple scintillation crystal layers. This method will be useful for fast 2D reconstruction of multiple-crystal-layer PET data.
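The core of sinogram formation is mapping each detected line of response (LOR) to a (radial offset, angle) bin. The geometry sketch below uses the abstract's figures of a crystal-pitch/2 radial interval and 120 angular samples as defaults; the function name and nearest-bin assignment are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def lor_to_sinogram_bin(p1, p2, r_pitch, n_angles=120):
    """Bin a 2D line of response through detector points p1, p2 into a
    (radial, angular) sinogram cell using nearest-bin assignment."""
    (x1, y1), (x2, y2) = p1, p2
    phi = np.arctan2(y2 - y1, x2 - x1) % np.pi   # LOR direction folded to [0, pi)
    theta = (phi + np.pi / 2) % np.pi            # angle of the LOR's normal
    r = x1 * np.cos(theta) + y1 * np.sin(theta)  # signed distance of LOR from origin
    i_ang = int(round(theta / (np.pi / n_angles))) % n_angles
    i_rad = int(round(r / r_pitch))
    return i_rad, i_ang
```

Histogramming list-mode events through a mapping like this, then weighting each bin by its sampling efficiency, is the general shape of the conversion the abstract describes.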

Acceptance Sampling Plans in the Rayleigh Model

  • Baklizi Ayman;El-Masri Abedel-Qader;AL-Nasser Amjad
    • Communications for Statistical Applications and Methods / v.12 no.1 / pp.11-18 / 2005
  • Assume that the lifetimes of the units under test follow the Rayleigh distribution and that the test is terminated at a pre-assigned time. Acceptance sampling plans are developed for this situation. The minimum sample size necessary to ensure the specified average life is obtained, and the operating characteristic values of the sampling plans and the producer's risk are given. An example is given to illustrate the methodology.
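The minimum-sample-size calculation can be sketched directly from the Rayleigh CDF: under a test truncated at time t, each unit fails by t with probability F(t), and a plan (n, c) accepts when at most c failures occur. The consumer's-risk formulation below (find the smallest n whose acceptance probability at an unacceptable scale sigma0 is at most beta) is an assumed, standard setup for illustration, not necessarily the paper's exact criterion:

```python
from math import comb, exp

def rayleigh_cdf(t, sigma):
    """P(failure by time t) for a Rayleigh(sigma) lifetime."""
    return 1.0 - exp(-t * t / (2.0 * sigma * sigma))

def min_sample_size(t, sigma0, c, beta=0.10, n_max=500):
    """Smallest n such that a plan accepting on <= c failures by the
    truncation time t has acceptance probability <= beta when the true
    Rayleigh scale is sigma0 (binomial OC-curve calculation)."""
    p = rayleigh_cdf(t, sigma0)
    for n in range(c + 1, n_max + 1):
        p_accept = sum(comb(n, k) * p**k * (1.0 - p)**(n - k)
                       for k in range(c + 1))
        if p_accept <= beta:
            return n
    return None
```

Evaluating `p_accept` across a range of scale values traces out the operating characteristic curve that the paper tabulates.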

FASIM: Fragments Assembly Simulation using Biased-Sampling Model and Assembly Simulation for Microbial Genome Shotgun Sequencing

  • Hur Cheol-Goo;Kim Sunny;Kim Chang-Hoon;Yoon Sung-Ho;In Yong-Ho;Kim Cheol-Min;Cho Hwan-Gue
    • Journal of Microbiology and Biotechnology / v.16 no.5 / pp.683-688 / 2006
  • We have developed a program for generating shotgun data sets from known genome sequences. Generation of synthetic data sets by computer program is a useful alternative to real data, to which students and researchers have limited access. The uniformly distributed sampling of clones adopted by previous programs cannot account for the real situation, in which sampled reads tend to come from particular regions of the target genome. To reflect this situation, a probabilistic model for a biased sampling distribution was developed using an experimental data set derived from a microbial genome project. Among the experimental parameters tested (varied fragment or read lengths, chimerism, and sequencing error), the extent of sequencing error was the most critical factor hampering sequence assembly. We propose that an optimum sequencing strategy employing different insert lengths and redundancy can be established by performing a variety of such simulations.
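The biased-sampling idea reduces to drawing read start positions from a non-uniform weight profile over the genome instead of a uniform one. The minimal sketch below assumes a hypothetical per-position weight array; FASIM's actual probabilistic model is fitted from project data and is richer than this:

```python
import numpy as np

def biased_read_starts(genome_len, read_len, n_reads, weights, seed=None):
    """Draw read start positions under a position-dependent bias profile:
    weights[i] is the relative chance that a read starts at position i."""
    rng = np.random.default_rng(seed)
    starts = np.arange(genome_len - read_len + 1)  # all valid start positions
    p = np.asarray(weights, dtype=float)[: len(starts)]
    p = p / p.sum()                                # normalize to a distribution
    return rng.choice(starts, size=n_reads, p=p)
```

Setting the weights uniform reproduces the older simulators' behavior; concentrating them on hotspot regions reproduces the clustered coverage that real shotgun projects exhibit.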

Full-color Non-hogel-based Computer-generated Hologram from Light Field without Color Aberration

  • Min, Dabin;Min, Kyosik;Park, Jae-Hyeung
    • Current Optics and Photonics / v.5 no.4 / pp.409-420 / 2021
  • We propose a method to synthesize a color non-hogel-based computer-generated hologram (CGH) from light field data of a three-dimensional scene with a hologram pixel pitch shared by all color channels. The non-hogel-based CGH technique generates a continuous wavefront with an arbitrary carrier wave from given light field data by interpreting the ray angles in the light field as the spatial frequencies of plane wavefronts. The relation between ray angle and spatial frequency is, however, wavelength dependent, which leads to different spatial frequency sampling grids in the light field data and thus to color aberrations in the hologram reconstruction. The proposed method sets a hologram pixel pitch common to all color channels such that the smallest (blue) diffraction angle covers the field of view of the light field. A spatial frequency sampling grid common to all color channels is then established by interpolating the light field with the spatial frequency range of the blue wavelength and the sampling interval of the red wavelength. The common hologram pixel pitch and light field spatial frequency sampling grid ensure the synthesis of a color hologram without color aberrations in the reconstructions and without loss of the information contained in the light field. The proposed method is successfully verified using color light field data of various test and natural 3D scenes.
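The pixel-pitch choice follows from the grating equation: a pitch p diffracts wavelength lam over a half-angle asin(lam / 2p), and blue, having the shortest wavelength, diffracts the least. Choosing the pitch so that blue still covers the field of view guarantees the other channels do too. The sketch below states only that relation; the function names and the 450/640 nm values in the usage are illustrative assumptions:

```python
from math import asin, sin, radians

def common_pixel_pitch(fov_half_angle, lam_blue):
    """Shared hologram pixel pitch: chosen so the smallest (blue)
    diffraction half-angle still covers the light-field half field of view."""
    return lam_blue / (2.0 * sin(fov_half_angle))

def diffraction_half_angle(lam, pitch):
    """Half-angle over which a pixel pitch diffracts wavelength lam."""
    return asin(lam / (2.0 * pitch))
```

With this pitch, `diffraction_half_angle` returns exactly the field-of-view half-angle for blue and a strictly larger angle for green and red, which is why no channel loses light field content.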

CHAID Algorithm by Cube-based Proportional Sampling

  • Park, Hee-Chang;Cho, Kwang-Hyun
    • Proceedings of the Korean Data and Information Science Society Conference / 2004.04a / pp.39-50 / 2004
  • The decision tree approach is most useful in classification problems, dividing the search space into rectangular regions. Decision tree algorithms are used extensively for data mining in many domains, such as retail target marketing, fraud detection, data reduction and variable screening, and category merging. CHAID (Chi-square Automatic Interaction Detector) uses the chi-squared statistic to determine splitting and is an exploratory method used to study the relationship between a dependent variable and a series of predictor variables. In this paper we propose a CHAID algorithm using cube-based proportional sampling and explore its accuracy and speed as the number of variables grows.
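The chi-squared splitting criterion at CHAID's core can be sketched in a few lines: build the contingency table of each predictor against the target and split on the predictor with the largest Pearson chi-square statistic. This is a generic illustration of the criterion (no merging of categories or p-value adjustment, which full CHAID performs):

```python
import numpy as np

def chi_square_stat(x, y):
    """Pearson chi-square statistic for the contingency table of two
    categorical arrays (no continuity correction)."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    obs = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(obs, (xi, yi), 1.0)                 # count co-occurrences
    exp = obs.sum(1, keepdims=True) * obs.sum(0, keepdims=True) / obs.sum()
    return float(((obs - exp) ** 2 / exp).sum())

def best_chaid_split(columns, y):
    """CHAID-style choice: the predictor with the largest chi-square vs. y."""
    stats = {name: chi_square_stat(col, y) for name, col in columns.items()}
    return max(stats, key=stats.get)
```

A perfectly predictive column yields a statistic equal to the sample size, while an independent column yields zero, so the selection rule reliably prefers informative predictors.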

Sampling Based Approach to Bayesian Analysis of Binary Regression Model with Incomplete Data

  • Chung, Young-Shik
    • Journal of the Korean Statistical Society / v.26 no.4 / pp.493-505 / 1997
  • The analysis of binary data arises in many areas, such as statistics, biometrics and econometrics. In many cases, data are collected in which some observations are incomplete. Assume that the missing covariates are missing at random and that the responses are completely observed. A method for Bayesian analysis of the binary regression model with incomplete data is presented. In particular, the desired marginal posterior moments of the regression parameter are obtained using the Metropolis algorithm (Metropolis et al., 1953) within the Gibbs sampler (Gelfand and Smith, 1990). We also compare the logit model with the probit model using a Bayes factor approximated by the importance sampling method. One example is presented.
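The Metropolis step used inside such a Gibbs sampler can be illustrated on the simplest case: a one-covariate logit model with complete data and a flat prior, sampled by random-walk Metropolis. This is a stripped-down sketch of the building block, not the paper's full missing-covariate sampler:

```python
import numpy as np

def log_post(beta, x, y):
    """Log-posterior of a one-covariate logit model under a flat prior."""
    eta = beta * x
    return float(np.sum(y * eta - np.log1p(np.exp(eta))))

def metropolis_draws(x, y, n_draws=3000, step=0.5, seed=1):
    """Random-walk Metropolis sampling of the regression coefficient."""
    rng = np.random.default_rng(seed)
    beta, lp, draws = 0.0, log_post(0.0, x, y), []
    for _ in range(n_draws):
        prop = beta + step * rng.standard_normal()   # symmetric proposal
        lp_prop = log_post(prop, x, y)
        if np.log(rng.random()) < lp_prop - lp:      # accept w.p. min(1, ratio)
            beta, lp = prop, lp_prop
        draws.append(beta)
    return np.array(draws)
```

In the full scheme, one such Metropolis update of the regression parameters alternates with Gibbs draws of the missing covariates from their conditional distributions.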
