• Title/Summary/Keyword: Sampling-Based Algorithm

Search Results: 477

A Design and Implementation of Volume Rendering Program based on 3D Sampling (3차원 샘플링에 기반을 둔 볼륨랜더링 프로그램의 설계 및 구현)

  • 박재영;이병일;최흥국
    • Journal of Korea Multimedia Society / v.5 no.5 / pp.494-504 / 2002
  • Volume rendering is a method of displaying volumetric data as a sequence of two-dimensional images. Because it can visualize structures inside objects, it is widely used to analyze medical images such as MRI, PET, and SPECT. In this paper, we suggest a method for easily creating images from sampled volumetric data and apply interpolation to medical images. To improve image quality, we implemented and applied two interpolation methods at the sampling stage: linear interpolation and cubic interpolation. We then compared the volume-rendered results using a transfer function. Because volume rendering of medical images visualizes fully three-dimensional data, we anticipate that image reconstruction from volumetric data sets can contribute significantly to diagnosis.

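The sampling-stage comparison described in the abstract above can be sketched in one dimension. This is a minimal illustration with hypothetical helper names, not the paper's implementation: linear interpolation uses the two neighboring voxel samples, while cubic (here Catmull-Rom) interpolation uses four, producing smoother reconstructed values between samples.

```python
def lerp(v0, v1, t):
    """Linear interpolation between two neighboring voxel samples."""
    return (1.0 - t) * v0 + t * v1

def catmull_rom(v0, v1, v2, v3, t):
    """Cubic (Catmull-Rom) interpolation using four neighboring samples."""
    return 0.5 * (
        2.0 * v1
        + (-v0 + v2) * t
        + (2.0 * v0 - 5.0 * v1 + 4.0 * v2 - v3) * t * t
        + (-v0 + 3.0 * v1 - 3.0 * v2 + v3) * t * t * t
    )

def sample(scalars, x, cubic=False):
    """Resample a 1-D scanline of voxel values at fractional position x
    (interior positions only; boundary handling is omitted in this sketch)."""
    i, t = int(x), x - int(x)
    if not cubic:
        return lerp(scalars[i], scalars[i + 1], t)
    return catmull_rom(scalars[i - 1], scalars[i],
                       scalars[i + 1], scalars[i + 2], t)

voxels = [0.0, 0.0, 1.0, 1.0]            # a sharp edge in the data
print(sample(voxels, 1.5))               # linear: 0.5
print(sample(voxels, 1.5, cubic=True))   # cubic: 0.5 here, smoother elsewhere
```

In a full renderer the same idea extends to trilinear and tricubic filtering of the 3D voxel grid along each cast ray.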

Material Optimization of BIW for Minimizing Weight (경량화를 위한 BIW 소재 최적설계)

  • Jin, Sungwan;Park, Dohyun;Lee, Gabseong;Kim, Chang Won;Yang, Heui Won;Kim, Dae Seung;Choi, Dong-Hoon
    • Transactions of the Korean Society of Automotive Engineers / v.21 no.4 / pp.16-22 / 2013
  • In this study, we propose a method of optimally changing the material of a BIW to minimize weight while satisfying vehicle requirements on static stiffness. First, we formulate a material selection optimization problem. Next, we establish a CAE procedure for evaluating static stiffness. Then, to enhance the efficiency of the design work, we integrate and automate the established CAE procedure using a commercial process integration and design optimization (PIDO) tool, PIAnO. For effective optimization, we adopt metamodel-based approximate optimization. An orthogonal array (OA) is used for selecting sampling points. The responses are evaluated at the sampling points, and these values are used to generate a metamodel of each response using a linear polynomial regression (PR) model. Using the linear PR models, optimization is carried out with an evolutionary algorithm (EA) that can handle discrete design variables. The material optimization result reveals that the weight is reduced by 44.8% while satisfying all the design constraints.
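The metamodel step in the abstract above can be sketched as follows. The L4(2^3) orthogonal array and the response values are illustrative, not the paper's data: responses are evaluated at the OA sampling points, and a linear polynomial regression model is fit by least squares to serve as a cheap surrogate for the CAE analysis.

```python
import numpy as np

# L4 orthogonal array for three two-level factors (levels coded 0/1)
OA = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]], dtype=float)

# Hypothetical responses evaluated at the four sampling points
y = np.array([10.0, 12.0, 13.0, 15.0])

# Fit y ~ b0 + b1*x1 + b2*x2 + b3*x3 by least squares
X = np.hstack([np.ones((4, 1)), OA])     # design matrix with intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def metamodel(x):
    """Cheap surrogate queried by the optimizer instead of the CAE run."""
    return beta[0] + beta[1] * x[0] + beta[2] * x[1] + beta[3] * x[2]

print(metamodel([1, 1, 0]))   # reproduces the fitted response at run 4
```

An evolutionary algorithm would then search the discrete material choices by evaluating this surrogate rather than the expensive stiffness analysis.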

Material Selection Optimization of A-Pillar and Package Tray Using RBFr Metamodel for Minimizing Weight (경량화를 위한 RBFr 메타모델 기반 A-필러와 패키지 트레이의 소재 선정 최적화)

  • Jin, Sungwan;Park, Dohyun;Lee, Gabseong;Kim, Chang Won;Yang, Heui Won;Kim, Dae Seung;Choi, Dong-Hoon
    • Transactions of the Korean Society of Automotive Engineers / v.21 no.5 / pp.8-14 / 2013
  • In this study, we propose a method of optimally selecting materials for the front pillar (A-pillar) and package tray to minimize weight while satisfying vehicle requirements on static and dynamic stiffness. First, we formulate a material selection optimization problem. Next, we establish a CAE procedure for evaluating static and dynamic stiffness. Then, to enhance the efficiency of the design work, we integrate and automate the established CAE procedure using a commercial process integration and design optimization (PIDO) tool, PIAnO. For effective optimization, we adopt metamodel-based approximate optimization. An orthogonal array (OA) is used for selecting sampling points. The responses are evaluated at the sampling points, and these values are used to generate a metamodel of each response using radial basis function regression (RBFr). Using the RBFr models, optimization is carried out with an evolutionary algorithm that can handle discrete design variables. The material optimization result reveals that the weight is reduced by 49.8% while satisfying all the design constraints.
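The RBFr surrogate mentioned above is a weighted sum of radial basis functions centered at the sampling points. A minimal one-dimensional sketch with made-up sample points and responses (Gaussian basis; the paper's actual kernel and data may differ):

```python
import numpy as np

centers = np.array([[0.0], [0.5], [1.0]])    # sampling points (1-D for brevity)
responses = np.array([1.0, 0.2, 0.9])        # hypothetical CAE responses

def phi(r, eps=2.0):
    """Gaussian radial basis function."""
    return np.exp(-(eps * r) ** 2)

# Solve Phi @ w = responses so the model interpolates the samples exactly
Phi = phi(np.abs(centers - centers.T))
w = np.linalg.solve(Phi, responses)

def rbf_model(x):
    """Surrogate prediction at a new design point x."""
    return float(phi(np.abs(x - centers[:, 0])) @ w)

print(round(rbf_model(0.5), 6))   # 0.2 -- interpolates the training sample
```

As with the linear PR case, the evolutionary algorithm evaluates this inexpensive model in place of the stiffness CAE runs.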

Development of a Natural Target-based Edge Analysis Method for NIIRS Estimation (NIIRS 추정을 위한 자연표적 기반의 에지분석기법 개발)

  • Kim, Jae-In;Kim, Tae-Jung
    • Korean Journal of Remote Sensing / v.27 no.5 / pp.587-599 / 2011
  • NIIRS (National Imagery Interpretability Rating Scale) is used as a measure of image interpretability. Unlike MTF (Modulation Transfer Function), SNR (Signal-to-Noise Ratio), and GSD (Ground Sampling Distance), NIIRS describes overall image quality from the user's perspective. NIIRS is either measured directly by human observers or estimated through edge analysis. Edge analysis commonly uses a specially manufactured artificial target: a tarp with black and white patterns that is deployed on the ground and imaged by the satellite. As a result, the artificial target-based method is expensive and cannot be performed frequently. In this paper, we propose a new edge analysis method that enables accurate NIIRS estimation using natural targets available in the image, taking the characteristics of each target into account. Various experiments were carried out to assess the algorithm. The results showed that our algorithm can be used as an alternative to the artificial target-based method.

Efficient 3D Object Simplification Algorithm Using 2D Planar Sampling and Wavelet Transform (2D 평면 표본화와 웨이브릿 변환을 이용한 효율적인 3차원 객체 간소화 알고리즘)

  • 장명호;이행석;한규필;박양우
    • Journal of KIISE: Computer Systems and Theory / v.31 no.5_6 / pp.297-304 / 2004
  • In this paper, a mesh simplification algorithm based on wavelet transform and 2D planar sampling is proposed for efficient handling of 3D objects in computer applications. Since conventional mesh compression and simplification algorithms transform 3D vertices with wavelets directly, they face difficult tiling optimization problems in the synthesis stage, where vertices must be reconnected into faces and vertex connectivity is heavily relied upon. In the proposed algorithm, however, the 3D mesh is sampled onto 2D planes and the 2D polygons on those planes are simplified independently. The transform of the 2D polygons is therefore very tractable, and their connection information is replaced with a simple sequence of vertices. The vertex sequence of the 2D polygons on each plane is analyzed with wavelets, and the transformed data are simplified by removing small wavelet coefficients that contribute little to the subjective quality of the shape. The proposed algorithm can thus change the mesh level-of-detail simply by controlling the distance between the 2D sampling planes and the selective removal of wavelet coefficients. Experimental results show that the proposed algorithm is a simple and efficient simplification technique with little external distortion.

A High-Speed Synchronization Method Robust to the Effect of Initial SFO in DRM Systems (DRM 시스템에서 초기 샘플링 주파수 옵셋의 영향에 강인한 고속 동기화 방식)

  • Kwon, Ki-Won;Cho, Yong-Soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.37 no.1A / pp.73-81 / 2012
  • In this paper, we propose a high-speed synchronization method for Digital Radio Mondiale (DRM) receivers. To satisfy the high-speed synchronization requirement of DRM receivers, the proposed method eliminates the initial sampling frequency synchronization process of conventional methods; instead, sampling frequency tracking is performed after integer frequency synchronization and frame synchronization. Different correlation algorithms are applied to detect the first frame of the Orthogonal Frequency Division Multiplexing (OFDM) demodulation symbol in the presence of a sampling frequency offset (SFO), and a frame detection algorithm that is robust to SFO is selected based on performance analysis and simulation. Simulation results show that the proposed method reduces the time spent on initial sampling frequency synchronization even when an SFO is present in the DRM signal. In addition, it is verified that the inter-cell differential correlation used between reference cells is robust to the effect of initial SFO.

A Deep Learning Based Over-Sampling Scheme for Imbalanced Data Classification (불균형 데이터 분류를 위한 딥러닝 기반 오버샘플링 기법)

  • Son, Min Jae;Jung, Seung Won;Hwang, Een Jun
    • KIPS Transactions on Software and Data Engineering / v.8 no.7 / pp.311-316 / 2019
  • The classification problem is to predict the class to which an input sample belongs. One of the most popular approaches is to train a machine learning algorithm on the given dataset, in which case the dataset should have a well-balanced class distribution for best performance. When the dataset has an imbalanced class distribution, classification performance can be very poor. To overcome this problem, we propose an over-sampling scheme that balances the number of samples per class using Conditional Generative Adversarial Networks (CGAN). CGAN is a generative model developed from Generative Adversarial Networks (GAN) that can learn data characteristics and generate data similar to real data. CGAN can therefore generate data for a class with few samples, mitigating the problem induced by the imbalanced class distribution and improving classification performance. Experiments on actually collected data show that over-sampling using CGAN is effective and superior to existing over-sampling techniques.

Teaching and learning about informal statistical inference using sampling simulation : A cultural-historical activity theory analysis (표집 시뮬레이션을 활용한 비형식적 통계적 추리의 교수-학습: 문화-역사적 활동이론의 관점에 따른 분석)

  • Seo Minju;Seo Yumin;Jung Hye-Yun;Lee Kyeong-Hwa
    • Journal of the Korean School Mathematics Society / v.26 no.1 / pp.21-47 / 2023
  • This study examines the activity system of teaching and learning about informal statistical inference using sampling simulation, based on cultural-historical activity theory. The research explores what contradictions arise in the activity system and how the system changes as a result of these contradictions. The participants were 20 elementary school students in the 5th to 6th grades who received classes on informal statistical inference using sampling simulations. Thematic analysis was used to analyze the data. The findings show that a contradiction emerged between the rule and the object, as well as between the mediating artifact and the object. It was confirmed that visualization of empirical sampling distribution was introduced as a new artifact while resolving these contradictions. In addition, contradictions arose between the subject and the rule and between the rule and the mediating artifact. It was confirmed that an algorithm to calculate the mean of the sample means was introduced as a new rule while resolving these contradictions.

Monte Carlo Estimation of Multivariate Normal Probabilities

  • Oh, Man-Suk;Kim, Seung-Whan
    • Journal of the Korean Statistical Society / v.28 no.4 / pp.443-455 / 1999
  • A simulation-based approach to estimating the probability of an arbitrary region under a multivariate normal distribution is developed. Specifically, the probability is expressed as the ratio of the unrestricted and the restricted multivariate normal density functions, where the restriction is given by the region whose probability is of interest. The density function of the restricted distribution is then estimated using a sample generated by the Gibbs sampling algorithm.

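Two ingredients of the approach above can be sketched for a bivariate normal and the positive orthant, with made-up parameters: crude Monte Carlo estimation of the region's probability, and a Gibbs sampler that draws from the normal distribution restricted to the region, each full-conditional being a truncated univariate normal. The paper's actual estimator combines the restricted and unrestricted densities; this is only the simulation scaffolding.

```python
import random
from statistics import NormalDist

ND = NormalDist()

def mc_orthant_prob(rho, n, seed=0):
    """Fraction of bivariate normal (correlation rho) draws in {x>0, y>0}."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x, y = z1, rho * z1 + (1 - rho * rho) ** 0.5 * z2
        hits += (x > 0) and (y > 0)
    return hits / n

def trunc_normal(mu, sigma, lower, rng):
    """Inverse-CDF draw from N(mu, sigma^2) restricted to (lower, inf)."""
    a = ND.cdf((lower - mu) / sigma)
    u = min(rng.uniform(a, 1.0), 1.0 - 1e-12)   # guard against u == 1
    return mu + sigma * ND.inv_cdf(u)

def gibbs_restricted(rho, n, burn=200, seed=0):
    """Gibbs draws from the bivariate normal restricted to the orthant."""
    rng = random.Random(seed)
    sd = (1 - rho * rho) ** 0.5
    x = y = 1.0
    out = []
    for i in range(n + burn):
        x = trunc_normal(rho * y, sd, 0.0, rng)   # X | Y = y, restricted X > 0
        y = trunc_normal(rho * x, sd, 0.0, rng)   # Y | X = x, restricted Y > 0
        if i >= burn:
            out.append((x, y))
    return out

# For rho = 0.5 the true orthant probability is 1/4 + arcsin(rho)/(2*pi)
print(mc_orthant_prob(0.5, 20000))
```

For low-dimensional regions the crude estimate suffices; the density-ratio construction in the paper is what makes the approach practical when the region's probability is small.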

Hexagon-shape Line Search Algorithm for Fast Motion Estimation on Media Processor (미디어프로세서 상의 고속 움직임 탐색을 위한 Hexagon 모양 라인 탐색 알고리즘)

  • Jung Bong-Soo;Jeon Byeung-Woo
    • Journal of the Institute of Electronics Engineers of Korea SP / v.43 no.4 s.310 / pp.55-65 / 2006
  • Most of fast block motion estimation algorithms reported so far in literatures aim to reduce the computation in terms of the number of search points, thus do not fit well with multimedia processors due to their irregular data flow. For multimedia processors, proper reuse of data is more important than reducing number of absolute difference operations because the execution cycle performance strongly depends on the number of off-chip memory access. Therefore, in this paper, we propose a Hexagon-shape line search (HEXSLS) algorithm using line search pattern which can increase data reuse from on-chip local buffer, and check sub-sampling points in line search pattern to reduce unnecessary SAD operation. Our experimental results show that the prediction error (MAE) performance of the proposed HEXSLS is similar to that of the full search block matching algorithm (FSBMA), while compared with the hexagon-based search (HEXBS), the HEXSLS outperforms. Also the proposed HEXSLS requires much lesser off-chip memory access than the conventional fast motion estimation algorithm such as the hexagon-based search (HEXBS) and the predictive line search (PLS). As a result, the proposed HEXSLS algorithm requires smaller number of execution cycles on media processor.