• Title/Summary/Keyword: STEP-Based Data Model

Application of k-w turbulence model to the analysis of the flow through a single stage axial-flow compressor (단단 축류압축기 유동해석에 대한 k-w 난류모델의 응용)

  • Lee, Joon-Suk;Kim, Kwang-Yong
    • The KSFM Journal of Fluid Machinery / v.3 no.3 s.8 / pp.7-11 / 2000
  • A numerical study based on a three-dimensional thin-layer Navier-Stokes solver is carried out to analyze the flowfield through a single stage transonic compressor. An explicit four-step Runge-Kutta scheme with spatially variable time step and implicit residual smoothing is used. The governing equations are discretized with an explicit finite difference method. A mixed-out averaging method is used at the interface between the rotor and stator, and an artificial dissipation model is used to assure the stability of the solution. The results with the k-w turbulence model were compared to the results with the Baldwin-Lomax model, and the physical phenomena of the transonic compressor are presented. The two turbulence models give results that show reasonably good agreement with experimental data.
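
The "explicit four-step Runge-Kutta scheme" named in the abstract refers to a standard low-storage pseudo-time integrator. Below is a minimal Python sketch of one such four-stage step for a generic semi-discrete residual du/dt = R(u); the stage coefficients and the toy residual are common illustrative choices, not the paper's actual solver.

```python
import numpy as np

def rk4_step(u, residual, dt, alphas=(0.25, 1/3, 0.5, 1.0)):
    """One explicit four-stage (low-storage) Runge-Kutta step for
    du/dt = R(u). These stage coefficients are a common CFD choice;
    the paper's exact scheme and residual are not reproduced here."""
    u0 = u.copy()
    for a in alphas:
        u = u0 + a * dt * residual(u)
    return u

# Toy linear-decay residual standing in for the Navier-Stokes residual.
residual = lambda u: -u
u = np.ones(5)
for _ in range(100):
    u = rk4_step(u, residual, dt=0.05)  # dt may vary per cell in practice
print(u)  # decays toward exp(-5) * ones
```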

Research on Application of SIR-based Prediction Model According to the Progress of COVID-19 (코로나-19 진행에 따른 SIR 기반 예측모형적용 연구)

  • Hoon Kim;Sang Sup Cho;Dong Woo Chae
    • Journal of Information Technology Applications and Management / v.31 no.1 / pp.1-9 / 2024
  • Predicting the spread of COVID-19 remains a challenge due to the complexity of the disease and its evolving nature. This study presents an integrated approach using the classic SIR model for infectious diseases, enhanced by the chemical master equation (CME). We employ a Monte Carlo method, the stochastic simulation algorithm (SSA), to solve the model, revealing unique aspects of SARS-CoV-2 transmission. The study, a first of its kind in Korea, adopts a step-by-step, complementary approach to model prediction. It starts by analyzing the epidemic's trajectory at the local government level using both basic and stochastic SIR models. These models capture the impact of public health policies on the epidemic's dynamics. The study then extends its scope from a single-infected-individual model to a more comprehensive model that accounts for multiple infections using the jump SIR prediction model. The practical application of this approach involves applying these layered, complementary SIR models to forecast the course of the COVID-19 epidemic in small to medium-sized local governments, particularly Gangnam-gu, Seoul. The results from these models are then compared and analyzed.
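
For readers unfamiliar with the SSA mentioned above, the following is a minimal Gillespie direct-method simulation of a stochastic SIR model; the rates and population sizes are illustrative placeholders, not the paper's fitted values for Gangnam-gu.

```python
import numpy as np

def sir_ssa(S, I, R, beta, gamma, t_max, rng=np.random.default_rng(0)):
    """Gillespie direct-method simulation of the stochastic SIR model.

    Events: infection  S + I -> 2I  at rate beta * S * I / N
            recovery   I -> R       at rate gamma * I
    """
    N = S + I + R
    t, path = 0.0, [(0.0, S, I, R)]
    while t < t_max and I > 0:
        a_inf = beta * S * I / N
        a_rec = gamma * I
        a_tot = a_inf + a_rec
        t += rng.exponential(1.0 / a_tot)       # time to next event
        if rng.random() < a_inf / a_tot:        # pick event by propensity
            S, I = S - 1, I + 1
        else:
            I, R = I - 1, R + 1
        path.append((t, S, I, R))
    return path

# Illustrative parameters, not the paper's estimates.
trajectory = sir_ssa(S=9990, I=10, R=0, beta=0.3, gamma=0.1, t_max=160)
print(trajectory[-1])  # final (t, S, I, R) state
```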

Probabilistic penalized principal component analysis

  • Park, Chongsun;Wang, Morgan C.;Mo, Eun Bi
    • Communications for Statistical Applications and Methods / v.24 no.2 / pp.143-154 / 2017
  • A variable selection method based on probabilistic principal component analysis (PCA) using the penalized likelihood method is proposed. The proposed method is a two-step variable reduction method. The first step uses the probabilistic principal component idea to identify principal components. The penalty function is used to identify important variables in each component. We then build a model on the original data space instead of on the rotated data space through latent variables (principal components), because the proposed method achieves dimension reduction by identifying important observed variables. Consequently, the proposed method is of more practical use. The proposed estimators perform as the oracle procedure and are root-n consistent with a proper choice of regularization parameters. The proposed method can be successfully applied to high-dimensional PCA problems with a relatively large portion of irrelevant variables included in the data set. It is straightforward to extend our likelihood method to handle problems with missing observations using EM algorithms; further, it can be effectively applied in cases where some data vectors exhibit one or more values missing at random.
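
A rough sketch of the two-step idea, assuming a Tipping-Bishop-style probabilistic PCA fit and a simple soft-threshold penalty (the paper's actual penalty form and estimation details may differ):

```python
import numpy as np

def penalized_ppca(X, n_comp, lam):
    """Two-step sketch: (1) probabilistic PCA loadings from the sample
    covariance, (2) soft-thresholding of the loadings as a stand-in for
    the paper's penalized-likelihood step (penalty form assumed)."""
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(evals)[::-1]
    evals, evecs = evals[order], evecs[:, order]
    sigma2 = evals[n_comp:].mean()                        # noise variance
    W = evecs[:, :n_comp] * np.sqrt(np.maximum(evals[:n_comp] - sigma2, 0))
    W_pen = np.sign(W) * np.maximum(np.abs(W) - lam, 0)   # L1 soft threshold
    selected = np.flatnonzero(np.any(W_pen != 0, axis=1)) # surviving variables
    return W_pen, selected

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 20))
X[:, :3] += 3 * rng.normal(size=(200, 1))  # three informative variables
W, keep = penalized_ppca(X, n_comp=2, lam=0.5)
print(keep)  # indices of retained variables
```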

Spatial Information Based Simulator for User Experience Optimization

  • Bang, Green;Ko, Ilju
    • Journal of the Korea Society of Computer and Information / v.21 no.3 / pp.97-104 / 2016
  • In this paper, we propose a spatial-information-based simulator for user experience optimization that minimizes real-space complexity. We focus on developing the simulator so that it designs a virtual space model and implements virtual characters using real space data. In particular, we use an expanded event-driven inference model for an SVM based on machine learning. Our simulator is capable of feature selection via k-fold cross-validation to optimize data learning; this strategy efficiently carries out inference of user behavior features through the virtual space model. We thus aim to develop a user experience optimization system that facilitates mapping, as a first step toward inference on daily-life data. Methodologically, we focus on user behavior and space modeling to implement the virtual space.
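
As a sketch of the k-fold cross-validated feature selection described above, the greedy loop below keeps a feature only if it improves the mean cross-validated accuracy of an SVM. The simulator's actual selection rule and spatial features are not specified in the abstract, so everything here is illustrative.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def select_features_kfold(X, y, k=5):
    """Greedy forward selection scored by k-fold cross-validated
    SVM accuracy (an assumed selection rule, for illustration)."""
    selected, best = [], 0.0
    for j in range(X.shape[1]):
        trial = selected + [j]
        score = cross_val_score(SVC(), X[:, trial], y, cv=k).mean()
        if score > best:
            best, selected = score, trial
    return selected, best

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)  # only features 0 and 2 matter
feats, acc = select_features_kfold(X, y)
print(feats, round(acc, 3))
```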

Generating Cartesian Tool Paths for Machining Sculptured Surfaces from 3D Measurement Data (3차원 측정자료부터 자유곡면의 가공을 위한 공구경로생성)

  • Ko, Byung-Chul;Kim, Kwang-Soo
    • Journal of Korean Institute of Industrial Engineers / v.19 no.3 / pp.123-137 / 1993
  • In this paper, an integrated approach is proposed to generate gouging-free Cartesian tool paths for machining sculptured surfaces from 3D measurement data. The integrated CAD/CAM system consists of two modules: an offset surface module and a Cartesian tool path module. The offset surface module generates an offset surface of an object from its 3D measurement data, using an offsetting method and a surface fitting method. The offsetting is based on the idea that the envelope of an inverted tool generates an offset surface without self-intersection as the center of the inverted tool moves along the surface of an object. Surface fitting is the process of constructing a compact representation to model the surface of an object from a fairly large number of data points. The resulting offset surface is a composite Bezier surface without self-intersection. When an appropriate tool-approach direction is selected, the tool path module generates the Cartesian tool paths while the deviation of the tool paths from the surface stays within the user-specified tolerance. The tool path module is a two-step process. The first step adaptively subdivides the offset surface into subpatches until the thickness of each subpatch is small enough to satisfy the user-defined tolerance. The second step generates the Cartesian tool paths by calculating the intersection of the slicing planes and the adaptively subdivided subpatches. This approach generates gouging-free Cartesian CL tool paths and optimizes the cutter movements by minimizing the number of interpolated points.
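
The second (slicing) step can be illustrated with a toy example: intersecting vertical planes with a sampled offset surface to produce one CL path per plane. The paper slices adaptively subdivided Bezier subpatches; the height-field grid below is a simplifying assumption.

```python
import numpy as np

def cartesian_tool_paths(offset_z, xs, ys, step=2):
    """Sketch of the slicing step: intersect planes x = const with a
    sampled offset surface (a height field here), yielding one CL path
    per slicing plane."""
    paths = []
    for i in range(0, len(xs), step):  # one slicing plane per index i
        path = [(xs[i], ys[j], offset_z[i, j]) for j in range(len(ys))]
        paths.append(path)
    return paths

xs = ys = np.linspace(0.0, 10.0, 21)
X, Y = np.meshgrid(xs, ys, indexing="ij")
offset_z = 0.1 * (X - 5) ** 2 + np.sin(Y)  # stand-in offset surface
paths = cartesian_tool_paths(offset_z, xs, ys)
print(len(paths), "paths,", len(paths[0]), "CL points each")
```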

Interpolation method of head-related transfer function based on the least squares method and an acoustic modeling with a small number of measurement points (최소자승법과 음향학적 모델링 기반의 적은 개수의 측정점에 대한 머리전달함수 보간 기법)

  • Lee, Seokjin
    • The Journal of the Acoustical Society of Korea / v.36 no.5 / pp.338-344 / 2017
  • In this paper, an interpolation method for HRTFs (Head-Related Transfer Functions) is proposed, especially for small measurement data sets. The proposed algorithm is based on acoustic modeling of HRTFs and interpolates them by estimating the model coefficients. However, estimating the model coefficients is difficult when measurement points are scarce, so the algorithm addresses this with data augmentation using VBAP (Vector Based Amplitude Panning). The proposed algorithm therefore consists of two steps: a data augmentation step based on VBAP and a model coefficient estimation step using the least squares method. The algorithm was evaluated in a simulation with measured data from the CIPIC (Center for Image Processing and Integrated Computing) HRTF database, and the results show that it reduces the mean-squared error by 1.5 dB to 4 dB compared with conventional algorithms.
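
A compact sketch of the two steps, assuming three measured directions, a toy HRTF matrix, and a simple affine basis for the acoustic model (the paper's actual model and the CIPIC data are not reproduced here):

```python
import numpy as np

def vbap_gains(p, L):
    """VBAP gains for target direction p, given a 3x3 matrix L whose
    columns are the unit vectors of three measured directions."""
    g = np.linalg.solve(L, p)
    return g / np.linalg.norm(g)

def fit_hrtf_model(dirs, hrtfs, basis):
    """Least-squares fit H ~ basis(dirs) @ C; the basis function is an
    assumption standing in for the paper's acoustic model."""
    A = basis(dirs)
    C, *_ = np.linalg.lstsq(A, hrtfs, rcond=None)
    return C

# Step 1: augment a sparse set with a VBAP-panned virtual measurement.
L = np.eye(3)                                   # three measured directions
p = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)      # unmeasured target direction
g = vbap_gains(p, L)
H_meas = np.random.default_rng(0).normal(size=(3, 64))  # toy HRTF spectra
H_virtual = g @ H_meas                          # panned (augmented) HRTF

# Step 2: least-squares model fit on measured + augmented data.
dirs = np.vstack([L.T, p])
H_all = np.vstack([H_meas, H_virtual])
C = fit_hrtf_model(dirs, H_all,
                   basis=lambda d: np.hstack([np.ones((len(d), 1)), d]))
print(C.shape)  # (4, 64) model coefficients
```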

A hydrodynamic model of nearshore waves and wave-induced currents

  • Sief, Ahmed Khaled;Kuroiwa, Masamitsu;Abualtayef, Mazen;Mase, Hajime;Matsubara, Yuhei
    • International Journal of Naval Architecture and Ocean Engineering / v.3 no.3 / pp.216-224 / 2011
  • This study develops a quasi-three-dimensional numerical model of wave-driven coastal currents, accounting for the effects of wave-current interaction and surface rollers. In the wave model, the current effects on wave breaking and energy dissipation are taken into account, as well as the wave diffraction effect. The surface roller associated with wave breaking was modeled based on a modification of the equations by Dally and Brown (1995) and Larson and Kraus (2002). Furthermore, the quasi-three-dimensional model, which is based on the Navier-Stokes equations, was modified to incorporate the surface roller effect and solved using a fractional step method. The model was validated against data sets obtained during experiments in the Large Scale Sediment Transport Facility (LSTF) basin and at the Hazaki Oceanographical Research Station (HORS). Then, a model test with a detached breakwater was carried out to investigate the performance of the model around coastal structures. Finally, the model was applied to Akasaki port to verify the hydrodynamics around coastal structures. Good agreement between computations and measurements was obtained with regard to the cross-shore variation in waves and currents in the nearshore and surf zones.
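
The fractional step (operator-splitting) idea used to solve the flow equations can be shown on a 1D advection-diffusion toy problem. The sketch below splits each time step into an advection sub-step and a diffusion sub-step; it only illustrates the splitting, not the paper's quasi-3D solver.

```python
import numpy as np

def fractional_step(u, c, nu, dx, dt):
    """One fractional (operator-splitting) step for 1D advection-
    diffusion, illustrating the splitting idea on a periodic grid."""
    # Sub-step 1: advection by first-order upwind differences.
    u_star = u - c * dt / dx * (u - np.roll(u, 1))
    # Sub-step 2: diffusion by explicit central differences.
    return u_star + nu * dt / dx**2 * (
        np.roll(u_star, 1) - 2 * u_star + np.roll(u_star, -1))

x = np.linspace(0, 1, 101)
u = np.exp(-200 * (x - 0.3) ** 2)  # initial pulse
for _ in range(200):
    u = fractional_step(u, c=0.5, nu=1e-3, dx=x[1] - x[0], dt=0.004)
print(u.max())  # pulse advects and spreads
```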

IFCXML Based Automatic Data Input Approach for Building Energy Performance Analysis

  • Kim, Karam;Yu, Jungho
    • Journal of Construction Engineering and Project Management / v.3 no.1 / pp.14-21 / 2013
  • To analyze building energy consumption, a building description for building energy performance analysis (BEPA) is required, and entering the required data for the subject building is a basic step in the BEPA process. Since building information modeling (BIM) is applied in the construction industry, the required data for BEPA can be gathered from a single international-standard file format such as IFCXML. However, in most BEPA processes, because the required data cannot be used directly from the IFCXML file, the building description for BEPA must be created again. This paper proposes an IFCXML-based automatic data input approach for BEPA. After the required data for BEPA are defined, automatic data input for BEPA is implemented in a prototype system. To evaluate the proposed system, a common BIM file from the BuildingSMART website is used as a sample model. The system can increase the efficiency and reliability of the BEPA process, since data input is performed automatically by using the IFCXML file directly.
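
As an illustration of reading BEPA inputs directly from an ifcXML file, the sketch below uses Python's standard ElementTree parser. The element names follow the IFC schema, but the exact namespace and attribute layout vary by ifcXML version, so treat the structure as assumed.

```python
import xml.etree.ElementTree as ET

def extract_bepa_inputs(ifcxml_path):
    """Sketch: collect BEPA-relevant elements straight from an ifcXML
    file. Tag and attribute names are assumptions based on the IFC
    schema, not the paper's prototype."""
    tree = ET.parse(ifcxml_path)
    elements = {}
    for tag in ("IfcWall", "IfcWindow", "IfcSpace"):
        # Match elements regardless of namespace prefix.
        found = [e for e in tree.iter() if e.tag.split("}")[-1] == tag]
        elements[tag] = [e.get("Name", "unnamed") for e in found]
    return elements

# Usage against a local copy of a BuildingSMART sample model (path assumed):
# print(extract_bepa_inputs("sample_model.ifcxml"))
```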

IFCXML Based Automatic Data Input Approach for Building Energy Performance Analysis

  • Ka-Ram Kim;Jung-Ho Yu
    • International conference on construction engineering and project management / 2013.01a / pp.173-180 / 2013
  • To analyze building energy consumption, a building description for building energy performance analysis (BEPA) is required, and entering the required data for the subject building is a basic step in the BEPA process. Since building information modeling (BIM) is applied in the construction industry, the required data for BEPA can be gathered from a single international-standard file format such as IFCXML. However, in most BEPA processes, because the required data cannot be used directly from the IFCXML file, the building description for BEPA must be created again. This paper proposes an IFCXML-based automatic data input approach for BEPA. After the required data for BEPA are defined, automatic data input for BEPA is implemented in a prototype system. To evaluate the proposed system, a common BIM file from the BuildingSMART website is used as a sample model. The system can increase the efficiency and reliability of the BEPA process, since data input is performed automatically by using the IFCXML file directly.

Development of Neural-Networks-based Model for the Generation of an Earthquake Response Spectrum and a Design Spectrum (지진 응답 스펙트럼과 설계용 응답 스펙트럼 생성을 위한 신경망 모델의 개발)

  • 조빈아;이승창;한상환;이병해
    • Proceedings of the Computational Structural Engineering Institute Conference / 1998.10a / pp.447-454 / 1998
  • The paper describes the second half of the research on the development of a neural-networks-based model for the generation of an artificial earthquake and a response spectrum (NNARS). Based on the redefined traditional processes for generating an earthquake acceleration response spectrum and a design spectrum, four neural-networks-based models are proposed to replace those processes. RS_NN tries to directly generate an acceleration response spectrum from basic data: magnitude, epicentral distance, site conditions, and focal depth. The test results of RS_NN are not good because of the characteristics of the randomly generated white noise. ARS_NN solves this problem by introducing an averaging concept. IARS_NN inverts ARS_NN and is applied to generate a ground motion accelerogram compatible with the shape of a response spectrum. Additionally, DS_NN directly produces a design spectrum from basic data. As these four neural networks are applied step by step, the paper describes the methods to generate a response spectrum and a design spectrum using them.
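
A minimal sketch of the RS_NN/DS_NN idea: map the four basic inputs (magnitude, epicentral distance, site condition, focal depth) to spectral ordinates with a small multilayer perceptron. The training data below are synthetic placeholders, not the recorded ground motions the paper used.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Inputs: magnitude, epicentral distance (km), site class, focal depth (km).
X = rng.uniform([5.0, 10.0, 0.0, 5.0], [8.0, 200.0, 2.0, 40.0], size=(500, 4))
periods = np.linspace(0.05, 3.0, 30)
# Toy target spectra: amplitude grows with magnitude, decays with distance.
Sa = np.exp(X[:, [0]] - 5.0) / X[:, [1]] * np.exp(-periods / 1.5)

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, Sa)
spectrum = model.predict([[6.5, 50.0, 1.0, 15.0]])  # one design-style query
print(spectrum.shape)  # (1, 30) spectral ordinates
```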
