• Title/Summary/Keyword: Data Modelling

Towards a digital twin realization of the blade system design study wind turbine blade

  • Baldassarre, Alessandro;Ceruti, Alessandro;Valyou, Daniel N.;Marzocca, Pier
    • Wind and Structures
    • /
    • v.28 no.5
    • /
    • pp.271-284
    • /
    • 2019
  • This paper describes the application of a novel virtual prototyping methodology to wind turbine blade design. Numerical modelling data and experimental data about turbine blade geometry and structural/dynamical behaviour are combined to obtain an affordable digital twin model, useful for reducing undesirable uncertainties during the entire turbine lifecycle. Moreover, this model can be used to track and predict blade structural changes, due for example to structural damage, and to assess its remaining life. A new interactive and recursive process is proposed. It includes CAD geometry generation and finite element analyses, combined with experimental data gathered from the structural testing of a new-generation wind turbine blade. The goal of the research is to show how the unique features of a complex wind turbine blade are considered in the virtual model updating process, fully exploiting the computational capabilities available to the designer in modern engineering. A composite Sandia National Laboratories Blade System Design Study (BSDS) turbine blade is used to exemplify the proposed process. Static, modal and fatigue experimental testing was conducted at the Clarkson University Blade Test Facility. A digital model was created and updated to conform to all the information available from experimental testing. When an updated virtual digital model is available, the performance of the blade during operation can be assessed with higher confidence.
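The updating step above, matching a virtual model to measured modal data, can be illustrated in miniature. The sketch below is not the paper's actual finite element procedure, and the frequency and modal mass are hypothetical; it simply back-calculates an effective stiffness from a measured natural frequency of a single-mode blade model:

```python
import math

def updated_stiffness(measured_freq_hz, modal_mass_kg):
    """Back-calculate the effective stiffness k that reproduces a
    measured natural frequency via f = (1 / (2*pi)) * sqrt(k / m)."""
    omega = 2.0 * math.pi * measured_freq_hz  # angular frequency, rad/s
    return modal_mass_kg * omega ** 2

# Hypothetical test data: first mode measured at 4.2 Hz, modal mass 120 kg.
k = updated_stiffness(4.2, 120.0)

# Round trip: the updated model now predicts the measured frequency.
f_model = math.sqrt(k / 120.0) / (2.0 * math.pi)
```

In the full methodology, many such parameters are adjusted simultaneously against static, modal and fatigue test data rather than one closed-form inversion.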

Verification of OpenMC for fast reactor physics analysis with China experimental fast reactor start-up tests

  • Guo, Hui;Huo, Xingkai;Feng, Kuaiyuan;Gu, Hanyang
    • Nuclear Engineering and Technology
    • /
    • v.54 no.10
    • /
    • pp.3897-3908
    • /
    • 2022
  • High-fidelity nuclear data libraries and neutronics simulation tools are essential for the development of fast reactors. The IAEA coordinated research project on "Neutronics Benchmark of CEFR Start-Up Tests" offers valuable data for the qualification of nuclear data libraries and neutronics codes. This paper focuses on the verification and validation of CEFR start-up modelling using the OpenMC Monte Carlo code against the experimental measurements. The OpenMC simulation results agree well with the measurements in criticality, control rod worth, sodium void reactivity, temperature reactivity, subassembly swap reactivity, and reaction distribution. In the feedback coefficient evaluations, an additional state method shows high consistency with lower uncertainty. Among the 122 relative errors in the benchmark of nuclear reaction distributions, 104 are less than 10% and 84 are less than 5%. The results demonstrate the high reliability of OpenMC for application in fast reactor simulations. In a companion paper, the influence of cross-section libraries is investigated using the neutronics models developed in this paper.
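The benchmark statistic quoted above (104 of 122 relative errors below 10%) is a straightforward tally of simulation-to-measurement relative errors. A minimal sketch, with hypothetical reaction-rate values standing in for the paper's data:

```python
def relative_errors(simulated, measured):
    """Relative error |sim - meas| / |meas| for each paired value."""
    return [abs(s - m) / abs(m) for s, m in zip(simulated, measured)]

def count_within(errors, tol):
    """How many relative errors fall strictly below the tolerance."""
    return sum(1 for e in errors if e < tol)

# Hypothetical normalized reaction rates (simulation vs. measurement).
sim = [1.02, 0.97, 1.10, 0.93]
meas = [1.00, 1.00, 1.00, 1.00]
errs = relative_errors(sim, meas)
```

The same tally over the benchmark's 122 reaction-distribution comparisons yields the 10% and 5% counts reported in the abstract.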

A Study on Efficient Technique of 3-D Terrain Modelling (3차원 지형모델링의 효율적 기법에 관한 연구)

  • 윤철규;신봉호;양승룡;엄재구
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.15 no.2
    • /
    • pp.207-213
    • /
    • 1997
  • The purpose of this study is to present an efficient technique for 3-D terrain modelling through multilateral approaches and to compare the results with raw data, using low-density, randomly located point data. Two sites were selected as study areas, taking into consideration the degree of freedom of the low-density, randomly located point data. Precision analysis of digital cartographic mapping using these data showed the following. First, in producing a digital cartographic map, the TIN-based technique gave good results and fast run times at both sites A and B. Second, the visualization analyses of digital cartographic maps using TIN-based and GRID-based terrain modelling techniques were similarly accurate at both sites, but the TIN-based technique produced smaller data sizes than the GRID-based one when the data were saved as DXF files. Third, in producing a digital cartographic map with GRID-based terrain modelling, the standard errors between the low-density, randomly located point data and the data interpolated by gridding were best with the radial basis function interpolation technique at both sites A and B.
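The radial basis function gridding that performed best in this study can be sketched as follows. This is a generic Gaussian-RBF interpolator over scattered elevation points, not the authors' software; the sample points and the shape parameter `eps` are illustrative:

```python
import numpy as np

def rbf_interpolate(points, values, query, eps=1.0):
    """Gaussian radial-basis-function interpolation of scattered
    elevation points onto arbitrary query locations."""
    pts = np.asarray(points, dtype=float)
    q = np.asarray(query, dtype=float)

    def kernel(a, b):
        # Pairwise squared distances, then the Gaussian basis.
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-eps * d2)

    # Solve for weights so the surface passes through every data point.
    w = np.linalg.solve(kernel(pts, pts), np.asarray(values, dtype=float))
    return kernel(q, pts) @ w

# Four scattered elevation samples on a unit square (hypothetical).
pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
z = [10.0, 12.0, 11.0, 13.0]
z_hat = rbf_interpolate(pts, z, [(0.0, 0.0), (0.5, 0.5)])
```

Because the weights solve an exact linear system, an RBF surface reproduces the input elevations exactly at the sample locations, which is why the standard errors in such comparisons are computed at held-out or re-gridded points.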

Collaborative Modeling of Medical Image Segmentation Based on Blockchain Network

  • Yang Luo;Jing Peng;Hong Su;Tao Wu;Xi Wu
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.17 no.3
    • /
    • pp.958-979
    • /
    • 2023
  • Due to laws, regulations, privacy, etc., between 70 and 90 percent of providers do not share medical data, forming "data islands". It is essential to collaborate across multiple institutions without sharing patient data. Most existing methods adopt distributed learning with a centralized federated architecture to solve this problem, but resource heterogeneity and data heterogeneity arise in practical applications. This paper proposes a collaborative deep learning modelling method based on a blockchain network. The training process transmits encrypted parameters instead of the original remote source data to protect privacy. The Hyperledger Fabric blockchain is adopted so that the parties are not restricted by a third-party authoritative verification end; to a certain extent, the distrust and single point of failure caused by a centralized system are avoided. The aggregation algorithm uses the FedProx algorithm to address device heterogeneity and data heterogeneity. The experiments show that the maximum improvement in segmentation accuracy in the collaborative training mode proposed in this paper is 11.179% compared to local training. In the sequential training mode, the average accuracy improvement is greater than 7%; in the parallel training mode, it is greater than 8%. The experimental results show that the model proposed in this paper can solve the current problem of centralized modelling of multicenter data. In particular, it provides ideas for protecting privacy and breaking "data silos", while protecting all data.
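The FedProx aggregation mentioned above can be sketched in a few lines: each client takes gradient steps regularized toward the current global weights by a proximal term, and the server averages client parameters weighted by local sample counts. A minimal sketch with hypothetical weights and a toy gradient, not the paper's implementation:

```python
import numpy as np

def fedprox_local_step(w, w_global, grad_loss, mu, lr):
    """One local gradient step with the FedProx proximal term:
    minimize loss(w) + (mu / 2) * ||w - w_global||^2."""
    return w - lr * (grad_loss(w) + mu * (w - w_global))

def server_aggregate(client_weights, client_sizes):
    """Server step: average client parameters weighted by sample count."""
    sizes = np.asarray(client_sizes, dtype=float)
    stacked = np.stack(client_weights)
    return (stacked * (sizes / sizes.sum())[:, None]).sum(axis=0)

# Two hypothetical clients with different data volumes.
w_a = np.array([1.0, 2.0])
w_b = np.array([3.0, 4.0])
w_new = server_aggregate([w_a, w_b], [10, 30])
```

The proximal coefficient `mu` is what distinguishes FedProx from plain FedAvg: it keeps heterogeneous clients from drifting too far from the global model between aggregation rounds. In the paper, these parameter exchanges travel encrypted over the Hyperledger Fabric network rather than through a central server.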

A Study on the Information Modeling of Defense R&D Process Using IDEF Methodology (IDEF 방법론을 이용한 국방 연구개발 프로세스의 정보모델링 연구)

  • Kim, Chul-Whan
    • The Journal of Society for e-Business Studies
    • /
    • v.10 no.1
    • /
    • pp.41-60
    • /
    • 2005
  • IDEF (Integrated Definition), a standard methodology for CALS process modelling, was applied to the weapon system R&D process to provide an information model, by analysing the goals, inputs, outputs and constraints of the R&D process. The information to be managed in R&D institutes was identified using SmartER, an automation program for IDEF1/1X, and an information model for the TO-BE model was obtained. The work process of weapon system R&D consists of the concept study phase, the exploration development phase, the system development phase, the prototype manufacturing phase, and the phase of writing the R&D results report. The information model of weapon system R&D is the R&D work process with information sharing by means of IWSDB. Since IDEF is suitable for large-scale system development such as weapon system R&D, further studies on IDEF would be required to achieve the goal of defense CALS.

An Analytical Appraisal of Building Information Modelling (BIM) Guidelines to Identify Variations in the Procedures

  • Das, Dakshata;Moon, Sungkon
    • Journal of KIBIM
    • /
    • v.6 no.3
    • /
    • pp.1-14
    • /
    • 2016
  • The use of Building Information Modelling (BIM) in building projects has enabled improvements in project planning, implementation and collaboration amongst the various stakeholders within the architecture, engineering and construction (AEC) industry. However, variations exist in the current practices of BIM implementation and coordination across the industry. These variations result in an inconsistent degree of BIM use across the construction industry. This inconsistency gives rise to several managerial and technological challenges, such as data interoperability issues and the purposeful integration and exchange of information within the BIM components. In order to tackle the issue, it is essential to analyse the different BIM approaches employed by industry practitioners. BIM guidelines serve as a critical link between the BIM model and its subsequent execution; they therefore provide the best reflection of BIM application and processes. This research paper aims to address the variations in BIM practices across the construction industry. It includes an extensive study of 21 existing, publicly available BIM-based guidelines in order to establish an understanding of the present state of practice and to deduce issues and concerns related to them. All guidelines analysed in this paper are first categorised by authorship and release date for efficient comparison; the points of similarity and difference between them are thereby identified and outlined. In addition, the transition of the project implementation process from traditional methods to BIM technology is also explained. The inconsistencies in the BIM guidelines reviewed in this paper reflect the need for a BIM 'Code Compliance Check', which would serve as a regulatory project guideline to further improve the potential of BIM by incorporating a consistent BIM modelling methodology for the entire construction industry.

A System for Shape Data Acquisition of Sculptured Surfaces and NC Machining Using Active Vision (Active vision을 이용한 곡면의 형상정보 획득 및 NC가공 시스템)

  • 손영태;최영
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 1992.04a
    • /
    • pp.256-261
    • /
    • 1992
  • Acquisition of 3D points is an essential process for modelling physical 3D objects. Although a Coordinate Measuring Machine (CMM) is the most accurate instrument for this purpose, it is very time-consuming. To enhance the data acquisition speed for sculptured surfaces, our system uses active vision with a reflectometric method. After data acquisition, the system automatically generates the cutting tool path for 3-axis milling of the object. The fully integrated system, from data acquisition to NC-code generation, was implemented on an IBM-PC/386 with the necessary hardware.
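The automatic tool-path generation step can be illustrated with a minimal serpentine (zigzag) pass over a grid of acquired surface heights. The grid values and step size below are hypothetical, and real NC-code generation would add feed rates, tool compensation and retract moves:

```python
def zigzag_toolpath(z_grid, step=1.0):
    """Serpentine 3-axis tool path over a measured height grid:
    odd rows are traversed in reverse so the cutter sweeps
    continuously instead of lifting back to the row start."""
    path = []
    for j, row in enumerate(z_grid):
        cols = list(enumerate(row))
        if j % 2 == 1:
            cols.reverse()  # alternate direction on every other row
        for i, z in cols:
            path.append((i * step, j * step, z))
    return path

# Hypothetical 2x2 grid of surface heights from the vision system.
grid = [[0.0, 0.1], [0.2, 0.3]]
path = zigzag_toolpath(grid)
```

Each (x, y, z) triple would then be emitted as a G01 linear move when writing the NC program.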

Application of GLIM to the Binary Categorical Data

  • Sok, Yong-U
    • Journal of the military operations research society of Korea
    • /
    • v.25 no.2
    • /
    • pp.158-169
    • /
    • 1999
  • This paper is concerned with the application of generalized linear interactive modelling (GLIM) to binary categorical data. To analyze categorical data given by a contingency table, finding a well-fitting loglinear model is commonly adopted. In the case of a contingency table with a response variable, we can fit a logit model to find a well-fitting loglinear model. For a given $2^4$ contingency table with a binary response variable, we show the process of fitting a loglinear model by fitting a logit model using GLIM and SAS, and then estimate the parameters to interpret the nature of the associations implied by the model.
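For the simplest non-trivial case, a 2×2 table with a binary response, the logit model has closed-form maximum likelihood estimates: the intercept is the observed log-odds in the baseline row, and the slope is the log odds ratio. A sketch with hypothetical cell counts (the paper's own analysis uses a $2^4$ table fitted with GLIM and SAS):

```python
import math

def logit_fit_2x2(table):
    """Saturated logit model for a 2x2 table with a binary response:
    logit(p_i) = alpha + beta * x_i, with x in {0, 1}.
    Rows are x = 0 and x = 1; columns are (success, failure)."""
    (a, b), (c, d) = table
    alpha = math.log(a / b)                  # log-odds in the x = 0 row
    beta = math.log((c / d) / (a / b))       # log odds ratio
    return alpha, beta

# Hypothetical counts: 10/40 successes at x = 0, 30/20 at x = 1.
alpha, beta = logit_fit_2x2([(10, 40), (30, 20)])
```

A positive `beta` here indicates that the odds of success are higher in the x = 1 row; exponentiating it recovers the familiar odds ratio (ad/bc in the usual notation). Larger tables like the paper's $2^4$ case require iterative fitting rather than closed forms.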

The improvement of the operating process of sewage treatment plants in the upstream area of a dam by MASS FLOW modelling (MASS FLOW 모델링을 통한 댐상류지역의 공공하수처리시설 공정개선방안)

  • Lee, Hyunseop;Lee, Jiwon;Gil, Kyungik
    • Journal of Wetlands Research
    • /
    • v.22 no.2
    • /
    • pp.130-138
    • /
    • 2020
  • As of 2017, the sewerage penetration rate of Seoul and the metropolitan cities is more than 90%, and the number of domestic sewage treatment plants increased by 25%, from 3,064 in 2010 to 4,072 in 2017. Among them, 585 sewage treatment plants are operated with the SBR process, 17% more than in 2010. In order to improve the water quality of the water source and the operation of small sewage facilities, process improvements were studied by applying modelling to 49 sewage treatment facilities in the Andong-Imha dam area: 3 plants with a capacity of more than 500㎥/day and 46 plants with less than 500㎥/day. Candidate facilities for modelling were derived by reviewing five years of operating data. The 49 facilities are operated with 12 types of operating processes; among them, 1 plant of more than 500㎥/day using the SBR method was selected, and 9 facilities of less than 500㎥/day were selected by dividing the 46 sites into 3 types. As a result of applying modelling to the plant with more than 500㎥/day capacity, the quality of the discharged water could be improved through SRT control, and it was found that applying the model to small-scale treatment plants can improve the removal efficiency of T-P by up to 14.4%. Consequently, the data of this study could be used to improve the operation of sewage treatment plants and rural community sewage treatment plants (RCSTP).
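The T-P removal figure quoted above follows from the standard mass-balance definition of removal efficiency. The influent and effluent concentrations below are hypothetical, chosen only to illustrate a 14.4% removal:

```python
def removal_efficiency(influent_mg_l, effluent_mg_l):
    """Percent removal of a pollutant across a treatment process:
    100 * (C_in - C_out) / C_in."""
    return 100.0 * (influent_mg_l - effluent_mg_l) / influent_mg_l

# Hypothetical T-P concentrations (mg/L) before and after the improvement.
tp_removal = removal_efficiency(2.0, 1.712)
```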

Design and Implementation of a Metadata System for Financial Information Data Modeling (금융정보 데이터 모델링을 위한 메타데이터 시스템의 설계 및 구현)

  • Cho, Sang-Hyuk
    • Journal of the Korea Society of Computer and Information
    • /
    • v.17 no.1
    • /
    • pp.81-85
    • /
    • 2012
  • As the business environment and complex working conditions change rapidly, large financial institutions are researching various ways to build systems that efficiently and accurately process the production and modification of financial information while minimizing data-processing errors. In this paper, we build a metadata system that provides stability, accuracy and convenience in financial data modelling, analyze its effects and, when new models are adopted, provide mapping information from existing models to efficiently connect models and databases. If modelling and standard data are managed through this metadata system, model modification work for data standardization and databases can be processed in a unitary system, and a consistent, high-quality data model can be maintained and managed when data modifications occur.