• Title/Summary/Keyword: computational tools


A Study on the Systems Engineering based Verification of a Systems Engineering Application Model for a LRT Project (경량전철사업 시스템엔지니어링 전산모델 검증에 관한 연구)

  • Han, Seok-Youn; Kim, Joo-Uk; Choi, Myung-Sung
    • Journal of the Korea Academia-Industrial cooperation Society / v.17 no.7 / pp.425-433 / 2016
  • The construction of a light rail transit (LRT) system is a large and complex infrastructure project involving hundreds of billions of won in construction costs for a single route, and because of its long-term operation it is very important to carry out such a project from a life-cycle perspective. Systems engineering is a means and methodology for successfully implementing customers' needs, and it is useful in large projects such as light rail transit. An application model called Systems Engineering for Light Rail Transit (SELRT) was developed to support systems engineering activities in LRT projects. To utilize the SELRT, it is necessary to ensure that its system requirements are met. Accordingly, this paper presents a verification procedure and architecture based on a systems engineering methodology: the system requirements are identified and the verification requirements needed to confirm the SELRT model are derived. The results show that the traceability between system requirements and verification requirements, the verification method for each requirement, and the demonstration results for the computerized tool are mutually connected, and that the initial requirements are clearly implemented in the SELRT. The proposed method is therefore valid for verifying the SELRT and can also be utilized in an LRT project.
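
As a loose illustration of the traceability this abstract describes, the sketch below checks that every system requirement traces to at least one verification requirement with a passing demonstration result; all IDs, methods, and results are invented placeholders, not data from the SELRT model.

```python
# Hypothetical requirement/verification links; IDs and methods are invented.
sys_to_verif = {
    "SR-01": ["VR-01", "VR-02"],   # each system requirement -> derived VRs
    "SR-02": ["VR-03"],
}

# Each verification requirement carries a method and a demonstration result.
verifications = {
    "VR-01": {"method": "inspection", "passed": True},
    "VR-02": {"method": "demonstration", "passed": True},
    "VR-03": {"method": "test", "passed": True},
}

def trace_complete(sys_to_verif, verifications):
    """Every system requirement traces to >= 1 verification requirement,
    and every linked verification has a passing demonstration result."""
    return all(
        bool(vrs) and all(verifications[vr]["passed"] for vr in vrs)
        for vrs in sys_to_verif.values()
    )
```

A requirement with no derived verification, or with a failed one, breaks the chain and `trace_complete` returns `False`.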

Computational Optimization for RC Columns in Tall Buildings (초고층 철근콘크리트 기둥의 전산최적설계 프로세스)

  • Lee, Yunjae; Kim, Chee-Kyeong; Choi, Hyun-Chul
    • Journal of the Korea Concrete Institute / v.26 no.3 / pp.401-409 / 2014
  • This research develops tools and strategies for optimizing the RC column sections used in tall buildings. The optimization parameters are concrete strength and section shape, and the objective function is subject to several predefined constraints drawn from the original structural design. For this purpose, we developed new components for StrAuto, a parametric modeling and optimization tool for building structures. The components receive from external analysis solvers the member strengths calculated for the original design model, and output the optimized column sections that satisfy minimum cost. Using these components, optimized sections are first obtained for each predefined concrete strength applied across all floors of the project building. The results for each concrete strength are then compared to determine the fittest sections, which in turn yield the fittest vertical zoning of concrete strength. The main optimization scenario is to search for the vertical levels where the identical optimized sections coincide for the two concrete strengths in question, and to select those levels as the boundaries where one concrete strength changes to another. The optimization process provided in this research is the product of an intensive development effort aimed at a specific member in a specific project, so the suggested algorithm takes a microscopic, mathematical approach. Nevertheless, the technique has considerable potential to be developed further and applied to future projects.
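
The zoning scenario above can be sketched roughly as follows; the demand model, the two strengths, and the 50 mm size module are invented assumptions for illustration, not the paper's StrAuto components.

```python
import math

def optimized_size(floor, fck, step=50):
    """Toy stand-in for the optimizer: square column size (mm) required for a
    fictitious axial demand, rounded up to a 50 mm module."""
    demand_kn = 12000 - 200 * floor              # invented axial demand profile
    area_mm2 = demand_kn * 1000 / (0.85 * fck)   # required concrete area
    return step * math.ceil(math.sqrt(area_mm2) / step)

def zoning_boundary(floors, fck_low, fck_high):
    """Lowest floor where both strengths yield the identical module size,
    i.e. a candidate level for changing the concrete strength."""
    for f in floors:
        if optimized_size(f, fck_low) == optimized_size(f, fck_high):
            return f
    return None

boundary = zoning_boundary(range(1, 61), fck_low=40, fck_high=60)
```

Below the boundary the higher strength still pays off with a smaller section; at and above it the sections coincide, so the cheaper strength suffices.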

Influence of Column Base Rigidity on Behavior of Steel Buildings (강구조물 지지부의 강성도가 구조물 거동에 미치는 영향)

  • 권민호; 박문호; 장준호; 박순응
    • Journal of the Computational Structural Engineering Institute of Korea / v.15 no.1 / pp.165-172 / 2002
  • Steel rigid frames are generally analyzed using finite element analysis tools. While many efforts have been devoted to understanding and accurately predicting the nonlinear behavior of columns and beam-column connections, column bases are usually modeled as simply hinged or fixed. In practice, however, a steel column base is neither fixed nor hinged: it behaves as semi-rigid. In this paper, the column supports are modeled as semi-rigid and the importance of this approach for moment-resisting frames is evaluated. Two typical buildings designed to the US specification are modeled and analyzed with finite elements based on the stiffness method and the flexibility method. The column bases of the three-story buildings are modeled as rotational springs with varying degrees of stiffness and strength that simulate the semi-rigidity of the base; depending on these values, the behavior ranges from hinged to fixed. Buildings with semi-rigid column bases behave similarly to buildings with fixed bases. However, pushover and nonlinear time-history analyses show numerically that, often due to construction and environmental effects, an undesired reduction of column base stiffness may increase the rotation demands on the first-story connections and lead to a soft-story mechanism.
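
A minimal sketch of why base rigidity matters, assuming a single cantilever column with a rotational spring at its base rather than the paper's three-story frame models; the section and spring values are invented.

```python
def lateral_stiffness(E, I, L, k_theta):
    """Tip force per unit tip displacement for a cantilever column with a
    rotational spring k_theta at the base, by flexibility superposition:
    delta = P*L**3 / (3*E*I) + P*L**2 / k_theta
    (column bending + rigid-body sway from base rotation)."""
    flexibility = L**3 / (3 * E * I) + L**2 / k_theta
    return 1.0 / flexibility

E, I, L = 200e9, 8e-5, 3.5      # steel modulus (Pa), fictitious I (m^4), height (m)
fixed = 3 * E * I / L**3        # classical fixed-base lateral stiffness
semi = lateral_stiffness(E, I, L, k_theta=5e7)   # assumed base spring (N*m/rad)
ratio = semi / fixed            # < 1: a softer base means larger drift demand
```

As `k_theta` grows without bound the fixed-base value is recovered, and as it tends to zero the lateral stiffness vanishes, matching the hinged cantilever.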

Establishment of Rebar Quantity Estimation in BIM-based Initial Design Phase (BIM기반 초기 설계 단계 철근 물량 산출 프로세스 구축)

  • Song, Chi-Ho; Kim, Chee-Kyeong; Lee, Si Eun; Choi, Hyunchul
    • Journal of the Computational Structural Engineering Institute of Korea / v.29 no.5 / pp.447-454 / 2016
  • Looking at the current status of 3D BIM-based rebar quantity estimation, which is in the limelight these days, commercial BIM tools provide rebar modeling functions, but modeling rebar with these functions takes a vast amount of time, so no BIM software is yet in practical use for this purpose. Therefore, in this study, we organized and presented a practical rebar quantity estimation process for BIM-based design work-sites, and developed a program named the Rebar Automatic Arrangement Program (RAAP), which automatically arranges rebar in columns, beams, slabs and walls based on much more precise cross-sectional information than the existing 2D method provides, even when no cross-sectional information is available in the initial design phase. In addition, we established a rebar quantity estimation process for the initial design phase by interworking RAAP with the modeling and quantity estimation functions of the BIM software BuilderHUB, taking joint and anchoring lengths into consideration. The results of this study are useful for developing a technology that can estimate quantities with better reliability and less effort than existing 2D-based methods when the framework quantity is estimated before the cross-sectional design of the structural members is complete in the initial design phase of a construction project. It is also expected to serve as a basic study from which a reasonable quantity estimation program for the initial design phase can be established.
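
As a hedged illustration of a schematic-phase rebar take-off (not RAAP's bar-by-bar arrangement), quantities can be approximated with assumed rebar ratios per member type; all ratios and volumes below are invented placeholders.

```python
STEEL_DENSITY = 7850  # kg/m^3

# Illustrative rebar ratios (steel volume / concrete volume) by member type;
# real values depend on the structural design.
TYPICAL_RATIO = {"column": 0.02, "beam": 0.015, "slab": 0.008, "wall": 0.01}

def rebar_mass(member_type, concrete_volume_m3, ratio_table=TYPICAL_RATIO):
    """Approximate rebar mass (kg) for one member in the schematic phase."""
    return concrete_volume_m3 * ratio_table[member_type] * STEEL_DENSITY

# (member type, concrete volume in m^3) for a fictitious partial model
members = [("column", 0.9), ("beam", 1.2), ("slab", 6.0), ("wall", 3.0)]
total_kg = sum(rebar_mass(t, v) for t, v in members)
```

An automatic arrangement like RAAP's replaces the flat ratios with bar counts, joints and anchorage lengths, which is what improves on 2D-based estimates.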

DNA Sequence Design using $\varepsilon$ -Multiobjective Evolutionary Algorithm ($\varepsilon$-다중목적함수 진화 알고리즘을 이용한 DNA 서열 디자인)

  • Shin, Soo-Yong; Lee, In-Hee; Zhang, Byoung-Tak
    • Journal of KIISE: Software and Applications / v.32 no.12 / pp.1217-1228 / 2005
  • Recently, as DNA computing has been widely studied for various applications, DNA sequence design, the most basic and important step in DNA computing, has been highlighted. In previous works, DNA sequence design was formulated as a multi-objective optimization task and solved by the elitist non-dominated sorting genetic algorithm (NSGA-II). However, NSGA-II required a great deal of computational time. Therefore, in this paper we use an $\varepsilon$-multiobjective evolutionary algorithm ($\varepsilon$-MOEA) to overcome the drawbacks of NSGA-II. To compare the performance of the two algorithms in detail, we apply both to the DTLZ2 benchmark function. $\varepsilon$-MOEA outperformed NSGA-II in both convergence and diversity, by $70\%$ and $73\%$ respectively; in particular, $\varepsilon$-MOEA finds optimal solutions in little computational time. Based on these results, we redesign the DNA sequences generated by previous DNA sequence design tools as well as the DNA sequences for the 7-travelling salesman problem (TSP). The experimental results show that $\varepsilon$-MOEA outperforms in most cases. In particular, for the 7-TSP, $\varepsilon$-MOEA achieves comparable results two times faster, finding final solutions with $22\%$ better diversity and $92\%$ better convergence in the same time.
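
The $\varepsilon$-dominance idea underlying $\varepsilon$-MOEA can be sketched as below: the objective space is divided into boxes of side $\varepsilon$, and a solution dominates another if its box does. This is a generic minimization sketch, not the exact operator of the cited implementation.

```python
import math

def box(f, eps):
    """Identify the epsilon-box that an objective vector falls into."""
    return tuple(math.floor(fi / e) for fi, e in zip(f, eps))

def eps_dominates(f1, f2, eps):
    """f1 epsilon-dominates f2 (minimization) if f1's box is no worse in
    every objective and strictly better in at least one."""
    b1, b2 = box(f1, eps), box(f2, eps)
    return all(x <= y for x, y in zip(b1, b2)) and b1 != b2
```

Keeping at most one archive member per box is what bounds the archive size and speeds $\varepsilon$-MOEA up relative to NSGA-II's full non-dominated sorting.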

Rough Computational Annotation and Hierarchical Conserved Area Viewing Tool for Genomes Using Multiple Relation Graph. (다중 관계 그래프를 이용한 유전체 보존영역의 계층적 시각화와 개략적 전사 annotation 도구)

  • Lee, Do-Hoon
    • Journal of Life Science / v.18 no.4 / pp.565-571 / 2008
  • Due to the rapid development of bioinformatics technologies, various biological data have been produced in silico, and complicated, large-scale biodata are now used to meet researchers' requirements. Developing visualization and annotation tools for such data remains a hot issue even though it has been studied for a decade; the diversity of the data and the varied requirements of users make it hard to build a general-purpose tool. In this paper, I propose a novel system, the Genome Viewer and Annotation tool (GenoVA), to annotate and visualize across genomes using known information and a multiple relation graph. Several multiple-alignment tools exist, but they lose conserved areas because of the complexity of their constraints. GenoVA extracts all associated information between every pair of genomes by extending pairwise alignment. A high-frequency conserved area with a high BLAST score becomes a block node of the relation graph, and the system connects associated block nodes to represent the multiple relation graph. The system also shows known information such as COG assignments, genes, and the hierarchical path of a block node, so it can annotate missed areas and unknown genes by navigating a block node's cluster. I experimented on ten bacterial genomes to extract the features for visualizing and annotating among them. GenoVA also supports simple, rough computational annotation of a new genome.
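
The block-node construction described above might be sketched like this; the pairwise hits are invented placeholders standing in for strong BLAST hits, and two blocks are connected when they overlap on the same genome.

```python
from collections import defaultdict

# Invented pairwise conserved regions: (genome_a, start, end, genome_b, start, end)
hits = [
    ("g1", 100, 500, "g2", 120, 510),
    ("g2", 120, 510, "g3", 300, 700),
    ("g1", 900, 1200, "g3", 50, 340),
]

def intervals(hit):
    g1, s1, e1, g2, s2, e2 = hit
    return [(g1, s1, e1), (g2, s2, e2)]

def share_region(h1, h2):
    """Two block pairs are related if their intervals overlap on one genome."""
    return any(ga == gb and sa < eb and sb < ea
               for ga, sa, ea in intervals(h1)
               for gb, sb, eb in intervals(h2))

# Connect related block nodes into the relation graph (adjacency sets).
graph = defaultdict(set)
for i in range(len(hits)):
    for j in range(i + 1, len(hits)):
        if share_region(hits[i], hits[j]):
            graph[i].add(j)
            graph[j].add(i)
```

Navigating a node's neighbors then follows conservation transitively across genomes, which is what lets known annotation propagate to unannotated regions.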

Development And Applying Detailed Competencies For Elementary School Students' Data Collection, Analysis, and Representation (초등학생의 데이터 수집, 분석, 표현 수업을 위한 세부역량 개발 및 적용)

  • Suh, Woong; Ahn, Seongjin
    • Journal of The Korean Association of Information Education / v.23 no.2 / pp.131-139 / 2019
  • From 2019, software education became a required subject for all elementary school students. However, many teachers are still unfamiliar with how such classes should be taught. This paper therefore presents the meaning, detailed competencies and achievement standards for data collection, analysis and representation, key components of the computational thinking at the heart of software education, and suggests how classes can apply them. The course of the paper is summarized as follows. First, existing studies were reviewed to summarize the meaning, detailed competencies and achievement standards of data-related competencies, and on this basis a preliminary investigation was conducted; the pilot study used FGI and closed questions at the same time so that the questionnaire would reflect the opinions of experts. Second, the resulting questionnaire was verified for validity, stability and reliability among PhDs, PhD candidates, software education teachers and software education practitioners. Third, using the backward design model to integrate curriculum, class and assessment, I developed and applied five lessons with objectives such as 'Choosing a collection method: select the collection method according to the problem situation', 'Finding the meaning of data: understand what the analyzed data mean', and 'Using various expression methods: use a variety of expression tools'. As a result, the detailed competencies of data collection, analysis and representation and their achievement standards are presented, which may help set specific criteria for the recommended direction of data-related classes in elementary schools.

Study on Damage Information Management Plan for Maintenance and Operation of River Facilities (하천시설 유지운영을 위한 손상정보 관리방안 연구)

  • Joo, Jae-Ha; Nam, Jeung-Yong; Kim, Tae-Hyung
    • Journal of the Computational Structural Engineering Institute of Korea / v.34 no.1 / pp.9-18 / 2021
  • Recently, the rapid proliferation, introduction, and application of fourth-industrial-revolution technology has become a trend in the construction market. Building Information Modeling (BIM) is a multidimensional information system that forms a basis of this technology, and its use in the river sector, for example in current maintenance measures, is being actively reviewed. River experts increasingly call for BIM to be introduced into the river field, and active research and practical work are needed to reflect this. In addition, developing tools and supporting software for building the various information systems is essential for activating facility maintenance information systems that reflect advanced technology and for establishing and operating management plans. This study on the maintenance of river facilities uses existing drawings to build a three-dimensional (3D) information model, checks damage with it, records the damage as information, and utilizes it as data for maintenance and reinforcement. The study determined how to build such a model for a river facility that has no existing information system and, by combining property maintenance information with 3D modeling, provides a more effective and more usable management plan for checking maintenance operations and managing damage.
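
A hedged sketch of linking damage records to 3D model elements, as the management plan above describes; the field names, element IDs and severity grading are invented for illustration, not the study's actual schema.

```python
from dataclasses import dataclass

@dataclass
class DamageRecord:
    element_id: str    # ID of the 3D model element (e.g. a revetment block)
    damage_type: str   # crack, scour, corrosion, settlement, ...
    severity: str      # "A" (minor) .. "E" (critical), an assumed grading
    inspected_on: str  # inspection date, ISO format

records = [
    DamageRecord("RV-017", "crack", "C", "2020-06-01"),
    DamageRecord("RV-017", "scour", "D", "2020-06-01"),
    DamageRecord("GT-002", "corrosion", "B", "2020-06-02"),
]

def by_element(records):
    """Group the damage history per model element for maintenance review."""
    grouped = {}
    for r in records:
        grouped.setdefault(r.element_id, []).append(r)
    return grouped
```

Keying damage by element ID is what ties inspection data back to the 3D information model, so each element's history can drive repair and reinforcement decisions.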

The Effect of SW education based on Physical Computing on the Computational Thinking ability of elementary school students (피지컬 컴퓨팅 기반 소프트웨어 교육이 초등학생의 컴퓨팅 사고력에 미치는 영향)

  • Lee, Jaeho;Kim, SunHyang
    • Journal of Creative Information Culture
    • /
    • v.7 no.4
    • /
    • pp.243-255
    • /
    • 2021
  • The purpose of this study is to investigate the effect of software education based on physical computing on the CT ability of elementary school students. To this end, previous studies related to physical computing software education and software education in the 2015 revised curriculum were analyzed. In addition, COBL was selected among many physical computing tools on the market in consideration of the level and characteristics of learners in the school to conduct the study, and 'Professor Lee Jae-ho's AI Maker Coding with COBL' was used as the textbook. This study was conducted for 10 sessions on 135 students in 6 classes in 6th grade of H Elementary School located in Pyeongtaek, Gyeong gi-do. The results of this study are as follows. First, it was confirmed that physical computing software education linked to real life was effective in improving the CT ability of elementary school students. Second, the change in competency of CT ability by sector improved evenly from 8 to 30 points in the pre-score and post-score of computing thinking ability. Third, in this study, it was confirmed that 87% of students were very positive as a result of a survey of satisfaction with classes after real-life physical computing software education. We hope that follow-up studies will help select various regions across cities and rural areas, and prove that real-life physical computing software education for various learner members, including large and small schools, will help elementary school students improve their CT ability.

Evaluation of the CNESTEN's TRIGA Mark II research reactor physical parameters with TRIPOLI-4® and MCNP

  • H. Ghninou; A. Gruel; A. Lyoussi; C. Reynard-Carette; C. El Younoussi; B. El Bakkari; Y. Boulaich
    • Nuclear Engineering and Technology / v.55 no.12 / pp.4447-4464 / 2023
  • This paper focuses on the development of a new computational model of the CNESTEN's TRIGA Mark II research reactor using the 3D continuous-energy Monte Carlo code TRIPOLI-4 (T4). This new model was developed to assess neutronic simulations and determine quantities of interest such as the kinetic parameters of the reactor, control rod worths, power peaking factors and neutron flux distributions. It is also a key tool for accurately designing new experiments in the TRIGA reactor, analyzing those experiments and carrying out sensitivity and uncertainty studies. The geometry and materials data of the MCNP reference model were used to build the T4 model, so differences between the two models are mainly due to the different mathematical approaches of the two codes. The study presented in this article is divided into two parts. The first part deals with the development and validation of the T4 model. The results obtained with it were compared to the existing MCNP reference model and to the experimental results from the Final Safety Analysis Report (FSAR), and different core configurations were simulated to test the computational model's reliability in predicting the physical parameters of the reactor. As fairly good agreement among the results was found, it seems reasonable to assume that the T4 model accurately reproduces the MCNP calculated values. The second part is devoted to the sensitivity and uncertainty (S/U) studies carried out to quantify the nuclear data uncertainty in the multiplication factor keff. For that purpose, the T4 model was used to calculate the sensitivity profiles of keff to the nuclear data. The integrated sensitivities were compared to results from previous works carried out with the MCNP and SCALE-6.2 simulation tools, and differences of less than 5% were obtained for most of these quantities except the C-graphite sensitivities. Moreover, the nuclear data uncertainties in keff were derived using the COMAC-V2.1 covariance matrix library and the calculated sensitivities. The results show that the total nuclear data uncertainty in keff is around 585 pcm using COMAC-V2.1. This study also demonstrates that the contribution of zirconium isotopes to the nuclear data uncertainty in keff is not negligible and should be taken into account when performing S/U analysis.
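
The propagation step in such S/U studies is commonly the "sandwich rule", var(keff) = S Σ Sᵀ, combining sensitivity profiles with a covariance matrix. The tiny example below illustrates it with invented sensitivities and covariances, not COMAC-V2.1 values.

```python
# Invented keff sensitivities to three nuclear-data parameters
# (relative change in keff per unit relative change in the parameter).
S = [0.35, -0.12, 0.08]

# Invented relative covariance matrix for those parameters.
Sigma = [[2.5e-4, 1.0e-5, 0.0],
         [1.0e-5, 4.0e-4, 0.0],
         [0.0,    0.0,    9.0e-5]]

# Sandwich rule: var = S * Sigma * S^T, here written as an explicit double sum.
var = sum(S[i] * Sigma[i][j] * S[j]
          for i in range(len(S)) for j in range(len(S)))
unc_pcm = var ** 0.5 * 1e5   # relative standard deviation expressed in pcm
```

With real libraries S holds a full energy-dependent sensitivity profile per isotope and reaction, but the arithmetic is the same double contraction.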