Title/Summary/Keyword: analysis and modeling

Prediction of Dynamic Power Consumption and IR Drop Analysis by efficient current modeling (효율적 전류모델을 이용한 고속의 전압 강하와 동적 파워 소모의 분석 기술)

  • Han, Sang-Yeol;Park, Sang-Jo;Lee, Yun-Sik
    • Journal of IKEEE / v.8 no.1 s.14 / pp.63-72 / 2004
  • In the nanometer SoC design environment, the supply voltage has dropped rapidly while the total wire length has grown exponentially. The ideal supply voltage is degraded sharply by the resistance and parasitic devices distributed along kilometers of wire, and this drop can severely affect the functional behavior of the design blocks. To analyze the effects of the long wires of an SoC while maintaining accuracy, current modeling and RC conversion of the parasitics are researched and applied. With this modeling and conversion, a multi-million-gate HDTV chipset can be analyzed within a day. A benchmark analysis of the HDTV SoC showed superiority over conventional methods in both performance and accuracy.
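
The abstract does not spell out the grid-solving step, but the general idea of static IR-drop analysis can be sketched as solving a nodal conductance system G·v = J for a power net with lumped current sinks. The toy Python sketch below (grid size, segment resistance, and sink currents are invented, and this is not the authors' tool) illustrates that computation on a single supply rail:

```python
# Minimal IR-drop sketch on a resistive power-rail model (illustration only;
# not the authors' tool -- rail length, resistances, and currents are made up).
import numpy as np

def ir_drop(n, r_seg, vdd, sink_currents):
    """Solve node voltages on an n-node resistor chain fed by VDD at node 0.

    r_seg         : resistance of each wire segment (ohms)
    sink_currents : current drawn at each node (A), length n
    Returns the voltage drop VDD - V at every node.
    """
    g = 1.0 / r_seg
    # Conductance (Laplacian-like) matrix of the chain
    G = np.zeros((n, n))
    for i in range(n - 1):
        G[i, i] += g
        G[i + 1, i + 1] += g
        G[i, i + 1] -= g
        G[i + 1, i] -= g
    # Node 0 is pinned to VDD: fold it into the right-hand side
    J = -np.asarray(sink_currents, dtype=float)
    J[1] += g * vdd                      # contribution of the fixed VDD node
    v = np.empty(n)
    v[0] = vdd
    v[1:] = np.linalg.solve(G[1:, 1:], J[1:])
    return vdd - v                        # IR drop per node

drops = ir_drop(n=6, r_seg=0.05, vdd=1.2,
                sink_currents=[0, 0.2, 0.2, 0.3, 0.1, 0.1])
print(np.round(drops, 4))                 # drop grows toward the far end
```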

Identifying research trends in the emergency medical technician field using topic modeling (토픽모델링을 활용한 응급구조사 관련 연구동향)

  • Lee, Jung Eun;Kim, Moo-Hyun
    • The Korean Journal of Emergency Medical Services / v.26 no.2 / pp.19-35 / 2022
  • Purpose: This study aimed to identify research topics in the emergency medical technician (EMT) field and examine research trends. Methods: In this study, 261 research papers published between January 2000 and May 2022 were collected, and EMT research topics and trends were analyzed using topic modeling techniques. The study used text mining and followed a workflow of data collection, keyword preprocessing, and analysis; keyword preprocessing and data analysis were carried out with the RStudio Version 4.0.0 program. Results: Keywords were derived through topic modeling analysis, and eight topics were ultimately identified: patient treatment, various roles, the performance of duties, cardiopulmonary resuscitation, triage systems, job stress, disaster management, and education programs. Conclusion: Based on the results, a study on the development and application of education programs that can effectively increase the emergency care capabilities of EMTs is needed.
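
The study itself works in R/RStudio; as a rough, language-agnostic illustration of the same workflow (collect documents, preprocess keywords, fit a topic model), the Python sketch below uses scikit-learn's LDA on a handful of invented titles:

```python
# Minimal topic-modeling sketch (LDA) in the spirit of the workflow described
# above; the titles and parameters are placeholders, not the study's corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

titles = [
    "job stress of emergency medical technicians in the field",
    "cardiopulmonary resuscitation performance by paramedics",
    "triage system operation in the emergency department",
    "education program development for emergency care capability",
    "disaster management and the role of emergency medical services",
]

vec = CountVectorizer(stop_words="english")      # keyword preprocessing
X = vec.fit_transform(titles)

lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topics = lda.fit_transform(X)                # document-topic proportions

# Top words per topic, analogous to naming topics such as "job stress"
terms = vec.get_feature_names_out()
for k, row in enumerate(lda.components_):
    top = [terms[i] for i in row.argsort()[-4:][::-1]]
    print(f"topic {k}: {top}")
```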

Failure Analysis of Deteriorated Reinforced Concrete T-Girder Bridge Subject to Cyclic Loading (정적 반복하중을 받는 노후된 철근콘크리트 T형교의 파괴해석)

  • 송하원;변근주
    • Magazine of the Korea Concrete Institute / v.10 no.6 / pp.291-301 / 1998
  • In this paper, two-dimensional and three-dimensional modeling techniques are proposed for the failure analysis of a deteriorated reinforced concrete T-girder bridge subjected to cyclic loading up to failure. For the nonlinear failure analysis, a tension-stiffening model that can account for the degradation of bond between reinforcement and the surrounding concrete due to corrosion of the rebars in an old bridge is proposed, together with a modeling technique for the support conditions that can account for degradation of the bearings at the supports. The analysis results, along with comparisons with full-scale failure-test results, confirm that the finite element modeling techniques in this paper can be applied well to the failure analysis of in-situ old reinforced concrete T-girder bridges subjected to cyclic loading, and that the support-condition modeling in particular affects the bridge strength significantly.
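
The paper's own tension-stiffening and support-condition models are not reproduced in the abstract; the sketch below only illustrates the general tension-stiffening idea, using a generic power-law post-cracking curve and a hypothetical bond-degradation factor standing in for corrosion damage:

```python
# Generic tension-stiffening illustration: average tensile stress in cracked
# concrete decays with strain; a hypothetical bond-degradation factor (from
# rebar corrosion) reduces the stiffening contribution. Not the paper's model.
import numpy as np

def tension_stiffening(eps, f_ct=2.5, eps_cr=1.0e-4, c=0.4, bond_factor=1.0):
    """Average tensile stress (MPa) at strain eps.

    f_ct        : tensile strength of concrete (MPa)
    eps_cr      : cracking strain
    c           : decay exponent (larger = faster softening)
    bond_factor : 1.0 = intact bond, < 1.0 = corrosion-degraded bond
    """
    eps = np.asarray(eps, dtype=float)
    pre = (eps / eps_cr) * f_ct                       # linear before cracking
    post = bond_factor * f_ct * (eps_cr / eps) ** c   # power-law decay after
    return np.where(eps <= eps_cr, pre, post)

strain = np.array([0.5e-4, 1e-4, 5e-4, 1e-3, 2e-3])
print(tension_stiffening(strain, bond_factor=1.0))    # intact bond
print(tension_stiffening(strain, bond_factor=0.6))    # corroded, weaker
```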

Design of Class Model Using Hierarchical Use Case Analysis for Object-Oriented Modeling (객체지향모델링 과정에서 계층적 유즈케이스(Use Case) 분석을 통한 클래스 도출 및 정의)

  • Lee, Jae-Woo
    • Journal of the Korea Academia-Industrial cooperation Society / v.10 no.12 / pp.3668-3674 / 2009
  • A use case diagram is used to define the interactions between users and a system in object-oriented modeling, and defining users' requirements is very important for efficient software development. In this paper, we propose an object-oriented modeling process that uses hierarchical use case analysis to design a class model. First, we define use case diagrams at several hierarchical modeling levels; next, we design a class model from those use case diagrams. The proposed modeling process provides interaction between the use case model and the class model, which allows the modeling process to be checked during software development. Using the proposed object-oriented modeling, software can be developed based on users' requirements, which is very useful for class modeling.
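
As a purely hypothetical illustration of deriving classes from a low-level use case (the domain, class names, and responsibilities are invented and not from the paper), a use case such as "Place Order", refined from a higher-level "Manage Orders" diagram, might suggest one control class plus the entity classes it manipulates:

```python
# Hypothetical illustration: classes suggested by a low-level use case
# "Place Order" refined from a higher-level "Manage Orders" use case.
# Names and responsibilities are invented for this sketch.
from dataclasses import dataclass, field
from typing import List


@dataclass
class OrderItem:              # entity class found in the use-case description
    product: str
    quantity: int


@dataclass
class Order:                  # entity class: what the use case creates
    customer: str
    items: List[OrderItem] = field(default_factory=list)


class PlaceOrderController:   # control class: one per low-level use case
    """Realizes the 'Place Order' use case step by step."""

    def place_order(self, customer: str, requested: List[OrderItem]) -> Order:
        order = Order(customer=customer, items=list(requested))
        # further steps (stock check, payment, ...) come from deeper use cases
        return order


order = PlaceOrderController().place_order("Kim", [OrderItem("sensor", 2)])
print(order)
```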

Systems Studies and Modeling of Advanced Life Support Systems

  • Kang, S.;Ting, K.C.;Both, A.J.
    • Agricultural and Biosystems Engineering / v.2 no.2 / pp.41-49 / 2001
  • Advanced Life Support Systems (ALSS) are being studied to support human life during long-duration space missions. ALSS can be categorized into four subsystems: Crew, Biomass Production, Food Processing and Nutrition, and Waste Processing and Resource Recovery. The Systems Studies and Modeling (SSM) team of the New Jersey-NASA Specialized Center of Research and Training (NJ-NSCORT) has facilitated and conducted analyses of ALSS to address systems-level issues. The underlying concept of the SSM work is to enable the effective utilization of information to aid in planning, analysis, design, management, and operation of ALSS and their components. Analytical tools and computer models for ALSS analyses have been developed and implemented for value-added information processing, and the results of these analyses have been delivered through the internet for effective communication within the advanced life support (ALS) community. Several modeling paradigms have been explored by developing tools for use in systems analysis; they include an object-oriented approach for top-level models, a procedural approach for process-level models, and the application of commercially available modeling tools such as MATLAB®/Simulink®. Every paradigm has its particular applicability for the purposes of the modeling work. An overview is presented of the systems studies and modeling work conducted by the NJ-NSCORT SSM team in its efforts to provide systems analysis capabilities to the ALS community. The experience gained and the analytical tools developed from this work can be extended to solving problems encountered in general agriculture.
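
As a rough sketch of the object-oriented, top-level paradigm mentioned above (subsystem objects exchanging mass flows), the toy model below lets a crew object and a biomass-production object exchange O2 and CO2 day by day; the classes, rates, and units are invented for illustration and are not the NJ-NSCORT models:

```python
# Toy object-oriented top-level ALSS sketch: two subsystem objects exchange
# O2 and CO2 once per simulated day. Classes and rates are invented for
# illustration and are not the NJ-NSCORT models.
class Crew:
    def __init__(self, members, o2_per_person=0.84, co2_per_person=1.0):
        self.members = members
        self.o2_per_person = o2_per_person      # kg O2 consumed / person / day
        self.co2_per_person = co2_per_person    # kg CO2 produced / person / day

    def step(self):
        return {"o2": -self.o2_per_person * self.members,
                "co2": self.co2_per_person * self.members}


class BiomassProduction:
    def __init__(self, area_m2, co2_uptake=0.03, o2_output=0.025):
        self.area_m2 = area_m2                  # crop growing area
        self.co2_uptake = co2_uptake            # kg CO2 fixed / m2 / day
        self.o2_output = o2_output              # kg O2 released / m2 / day

    def step(self):
        return {"o2": self.o2_output * self.area_m2,
                "co2": -self.co2_uptake * self.area_m2}


subsystems = [Crew(members=4), BiomassProduction(area_m2=120)]
store = {"o2": 100.0, "co2": 10.0}              # kg in the shared atmosphere
for day in range(3):
    for s in subsystems:
        for gas, delta in s.step().items():
            store[gas] += delta
    print(f"day {day + 1}: {store}")
```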

Efficient Topic Modeling by Mapping Global and Local Topics (전역 토픽의 지역 매핑을 통한 효율적 토픽 모델링 방안)

  • Choi, Hochang;Kim, Namgyu
    • Journal of Intelligence and Information Systems / v.23 no.3 / pp.69-94 / 2017
  • Recently, growing demand for big data analysis has been driving the vigorous development of related technologies and tools. In addition, advances in IT and the increased penetration of smart devices are producing a large amount of data, and data analysis technology is rapidly becoming popular as attempts to acquire insights through data analysis continuously increase. Big data analysis is therefore expected to become more important across various industries for the foreseeable future. Big data analysis is generally performed by a small number of experts and delivered to each demander of the analysis. However, the growing interest in big data analysis has stimulated computer programming education and the development of many programs for data analysis. Accordingly, the entry barriers to big data analysis are gradually lowering and data analysis technology is spreading; as a result, big data analysis is expected to be performed by the demanders of the analysis themselves. Along with this, interest in various kinds of unstructured data is continually increasing, with particular attention focused on text data. The emergence of new web-based platforms and techniques has brought about the mass production of text data and active attempts to analyze it, and the results of text analysis have been utilized in various fields. Text mining is a concept that embraces various theories and techniques for text analysis. Among the many text mining techniques used for various research purposes, topic modeling is one of the most widely used and studied. Topic modeling is a technique that extracts the major issues from a large number of documents, identifies the documents that correspond to each issue, and provides the identified documents as a cluster; it is evaluated as a very useful technique in that it reflects the semantic elements of the documents. Traditional topic modeling is based on the distribution of key terms across the entire document collection, so the entire collection must be analyzed at once to identify the topic of each document. This causes long analysis times when topic modeling is applied to a large number of documents, and it entails a scalability problem: the processing time increases exponentially with the number of analysis objects. The problem is particularly noticeable when the documents are distributed across multiple systems or regions. To overcome these problems, a divide-and-conquer approach can be applied to topic modeling: a large number of documents is divided into sub-units, and topics are derived by repeating topic modeling on each unit. This method enables topic modeling on a large number of documents with limited system resources and can improve the processing speed of topic modeling. It can also significantly reduce analysis time and cost, because documents can be analyzed in each location without first combining them. Despite these advantages, however, the method has two major problems. First, the relationship between the local topics derived from each unit and the global topics derived from the entire collection is unclear: local topics can be identified for each document, but global topics cannot. Second, a method for measuring the accuracy of the proposed methodology needs to be established; that is, assuming the global topics are the ideal answer, the deviation of the local topics from the global topics needs to be measured.
Because of these difficulties, this approach has not been studied as thoroughly as other topic modeling methods. In this paper, we propose a topic modeling approach that solves the above two problems. First, we divide the entire document cluster (global set) into sub-clusters (local sets) and generate a reduced global set (RGS) consisting of delegate documents extracted from each local set. We address the first problem by mapping RGS topics to local topics. We then verify the accuracy of the proposed methodology by detecting whether documents are assigned to the same topic in the global and local results. Using 24,000 news articles, we conduct experiments to evaluate the practical applicability of the proposed methodology. Through an additional experiment, we also confirmed that the proposed methodology can provide results similar to topic modeling of the entire collection, and we propose a reasonable method for comparing the results of the two approaches.
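
A minimal sketch of the divide-and-conquer idea follows, assuming local topics are mapped to global topics by cosine similarity of their topic-word distributions; the corpus, parameters, and mapping rule here are illustrative stand-ins, not the paper's RGS procedure:

```python
# Sketch of divide-and-conquer topic modeling with local-to-global topic
# mapping by cosine similarity. This illustrates the general idea only,
# not the paper's RGS procedure; the corpus and parameters are made up.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "power grid voltage drop analysis", "ir drop current model soc design",
    "topic modeling of text documents", "latent dirichlet allocation topics",
    "bridge concrete failure analysis", "reinforced concrete girder corrosion",
]

# Shared vocabulary so local and global topic-word vectors are comparable
vec = CountVectorizer()
X = vec.fit_transform(docs)

def fit_topic_word(X, k):
    lda = LatentDirichletAllocation(n_components=k, random_state=0)
    lda.fit(X)
    return lda.components_ / lda.components_.sum(axis=1, keepdims=True)

global_topics = fit_topic_word(X, k=3)        # topics over all documents
local_topics = fit_topic_word(X[:3], k=2)     # topics over one sub-cluster

# Map each local topic to its most similar global topic
sim = cosine_similarity(local_topics, global_topics)
mapping = sim.argmax(axis=1)
print("local -> global topic mapping:", mapping)
```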

A Case Study on Precise NURBS Modeling of Human Organs (인체장기의 정밀한 NURBS 곡면 모델링 사례연구)

  • Kim H.C.;Bae Y.H.;Soe T.W.;Lee S.H.
    • Proceedings of the Korean Society of Precision Engineering Conference / 2005.06a / pp.915-918 / 2005
  • Advances in information technology and in biomedicine have created new uses for CAD technology, with many novel and important biomedical applications. Such applications can be found, for example, in the design and modeling of orthopedics, medical implants, and tissue modeling, in which CAD can be used to describe the morphology, heterogeneity, and organizational structure of tissue and anatomy. CAD has also played an important role in computer-aided tissue engineering for the biomimetic design, analysis, simulation, and freeform fabrication of tissue scaffolds and substitutes. All of these applications require precise geometry of the organs or bones of each patient. However, the geometry currently used is a polygon model with no solid geometry, and it is so rough that it cannot be utilized for accurate analysis, simulation, or fabrication. Therefore, a case study was performed to deduce a transformation method that builds free-form surfaces from the rough polygon data or medical images currently used in these applications. This paper describes the transformation procedure in detail and discusses the considerations for accurate organ modeling.
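
As a stand-in for NURBS surface fitting (which the abstract describes but does not detail), the sketch below fits an interpolating bicubic B-spline surface to synthetic gridded samples with SciPy; real organ data are scattered point clouds and would need parameterization and a true NURBS fit first:

```python
# Minimal sketch: fitting a bicubic B-spline surface to gridded sample points
# as a stand-in for NURBS surface fitting (synthetic data; real organ surfaces
# are scattered point clouds and need parameterization first).
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Synthetic "organ-like" height field sampled on a coarse grid
u = np.linspace(0.0, 1.0, 12)
v = np.linspace(0.0, 1.0, 15)
U, V = np.meshgrid(u, v, indexing="ij")
Z = 0.3 * np.sin(np.pi * U) * np.cos(np.pi * V)   # placeholder measurements

# Interpolating bicubic spline; a positive s would smooth noisy measurements
surf = RectBivariateSpline(u, v, Z, kx=3, ky=3)

# Evaluate the fitted surface on a finer grid for downstream CAD/analysis use
uf = np.linspace(0.0, 1.0, 50)
vf = np.linspace(0.0, 1.0, 50)
Zf = surf(uf, vf)
print(Zf.shape)   # (50, 50) smooth surface samples
```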

Modeling of Beam Structures from Modal Parameters (모달 파라미터를 이용한 보 구조물의 모델링)

  • Hwang, Woo-Seok
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference / 2006.11a / pp.519-522 / 2006
  • Accurate modeling of a dynamic system from experimental data is the basis for model updating or health monitoring of the system. Modal analysis, or a modal test, is a routine process for obtaining the modal parameters of a dynamic system: the natural frequencies, damping ratios, and mode shapes. This paper presents a new method that can derive the equations of motion of a dynamic system from the modal parameters obtained by modal analysis or modal testing. The present method derives the mass, damping, and stiffness matrices of the system based on the relation between the eigenvalues and eigenvectors of the state-space equation. The modeling of a cantilevered beam from modal parameters is given as an example to prove the efficiency and accuracy of the present method. Using only the lateral displacements, not the rotations, gives limited information about the system. The numerical verification to date gives reasonable results, and verification with test data is scheduled.
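
The paper derives the system matrices from state-space eigenvalues and eigenvectors; a simpler illustrative reconstruction, assuming a complete set of mass-normalized mode shapes and modal (proportional) damping, with invented numbers, looks like this:

```python
# Sketch: reconstructing mass, damping, and stiffness matrices from modal
# parameters, assuming a complete set of mass-normalized mode shapes and
# modal (proportional) damping. Illustrative values, not the paper's data.
import numpy as np

omega = np.array([10.0, 62.7, 175.5])        # natural frequencies (rad/s)
zeta  = np.array([0.01, 0.015, 0.02])        # damping ratios
Phi   = np.array([[ 0.33,  0.74,  0.59],     # columns = mass-normalized modes
                  [ 0.59,  0.33, -0.74],
                  [ 0.74, -0.59,  0.33]])

Phi_inv = np.linalg.inv(Phi)
# Phi^T M Phi = I, Phi^T K Phi = diag(w^2), Phi^T C Phi = diag(2 z w)
M = Phi_inv.T @ Phi_inv
K = Phi_inv.T @ np.diag(omega**2) @ Phi_inv
C = Phi_inv.T @ np.diag(2 * zeta * omega) @ Phi_inv

# Check: eigenvalues of M^-1 K should return the squared natural frequencies
w2 = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)
print(np.sqrt(w2))                           # ~ [10.0, 62.7, 175.5]
```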

Application of IFC Standard in Interoperability and Energy Analysis

  • Hyunjoo Kim;Zhenhua Shen
    • International conference on construction engineering and project management / 2013.01a / pp.87-93 / 2013
  • In this research, a new methodology for performing building energy analysis using the Industry Foundation Classes (IFC) standard has been studied. With the help of the ArchiCAD 14 modeling software, a 3D test model is generated and then exported to IFCXML format. A Ruby program retrieves the building information from the resulting IFCXML file using the Nokogiri library, and an INP file is created for the next energy analysis step. The DOE-2.2 program analyzes the INP file and gives a detailed report of the energy cost of the building. The case study shows that, when the IFC-based method is used, the interoperability of the energy analysis is greatly improved: mainstream 3D building modeling software supports the IFC standard, and DOE-2.2 is able to read the INP file generated from the IFC file. This means that almost any 3D model created by mainstream modeling software can be analyzed in terms of energy cost. Thus, the IFC-based energy analysis method has a promising future. With the development and application of the IFC standard, designers can carry out more complex energy analyses more easily and efficiently.
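
The paper's extraction step is written in Ruby with Nokogiri; as a Python analogue with simplified, hypothetical element and attribute names (real ifcXML is far more nested), the sketch below pulls space quantities from an XML export and emits a couple of INP-style lines:

```python
# Python analogue of the paper's Ruby/Nokogiri step: pull space data out of
# an ifcXML export and emit DOE-2-style INP lines. The element and attribute
# names below are simplified placeholders, not the real ifcXML schema.
import xml.etree.ElementTree as ET

SAMPLE = """<model>
  <IfcSpace id="sp1" name="Office-101" area="24.5" height="2.7"/>
  <IfcSpace id="sp2" name="Office-102" area="18.0" height="2.7"/>
</model>"""

root = ET.fromstring(SAMPLE)
inp_lines = []
for space in root.iter("IfcSpace"):
    name = space.get("name")
    area = float(space.get("area"))
    height = float(space.get("height"))
    # Emit a minimal INP-style SPACE block with the extracted quantities
    inp_lines.append(
        f'"{name}" = SPACE\n   AREA   = {area}\n   VOLUME = {area * height}\n   ..'
    )
print("\n".join(inp_lines))
```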

Analysis of the Cognitive Level of Meta-modeling Knowledge Components of Science Gifted Students Through Modeling Practice (모델링 실천을 통한 과학 영재학생들의 메타모델링 지식 구성요소별 인식수준 분석)

  • Kihyang, Kim;Seoung-Hey, Paik
    • Journal of the Korean Chemical Society / v.67 no.1 / pp.42-53 / 2023
  • The purpose of this study is to obtain basic data for constructing a modeling practice program integrated with meta-modeling knowledge by analyzing the cognition level for each meta-modeling knowledge components through modeling practice in the context of the chemistry discipline content. A chemistry teacher conducted inquiry-based modeling practice including anomalous phenomena for 16 students in the second year of a science gifted school, and in order to analyze the cognition level for each of the three meta-modeling knowledge components such as model variability, model multiplicity, and modeling process, the inquiry notes recorded by the students and observation note recorded by the researcher were used for analysis. The recognition level was classified from 0 to 3 levels. As a result of the analysis, it was found that the cognition level of the modeling process was the highest and the cognition level of the multiplicity of the model was the lowest. The cause of the low recognitive level of model variability is closely related to students' perception of conceptual models as objective facts. The cause of the low cognitive level of model multiplicity has to do with the belief that there can only be one correct model for a given phenomenon. Students elaborated conceptual models using symbolic models such as chemical symbols, but lacked recognition of the importance of data interpretation affecting the entire modeling process. It is necessary to introduce preliminary activities that can explicitly guide the nature of the model, and guide the importance of data interpretation through specific examples. Training to consider and verify the acceptability of the proposed model from a different point of view than mine should be done through a modeling practice program.