• Title/Summary/Keyword: Data-Integration


Design of a loosely-coupled GPS/INS integration system (약결합 방식의 GPS/INS 통합시스템 설계)

  • 김종혁;문승욱;김세환;황동환;이상정;오문수;나성웅
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.2 no.2
    • /
    • pp.186-196
    • /
    • 1999
  • The GPS provides long-term stable data independent of elapsed time, and the INS provides high-rate data with short-term stability. By integrating these complementary systems, a highly accurate navigation system can be achieved. In this paper, a loosely-coupled GPS/INS integration system is designed. It has a simple structure, is easy to implement, and preserves the independent navigation capabilities of the GPS and the INS. The integration system consists of an NCU, an IMU, a GPS receiver, and a monitoring system. The navigation algorithm in the NCU is designed in a multi-tasking environment based on a real-time kernel, and the monitoring system is designed using Visual C++. The integrated Kalman filter is designed as a feedback-type 15-state filter, in which the states are position errors, velocity errors, attitude errors, and sensor bias errors. The van test results show that the integrated system provides a more accurate navigation solution than the inertial-alone or GPS-alone navigation system.

  • PDF
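The feedback error-state filter described in the abstract above can be sketched in miniature. The Python below reduces the paper's 15-state filter to two states (position error, velocity error) purely for illustration; the step size and noise values are invented, and the measurement is the GPS-minus-INS position difference that characterizes loose coupling.

```python
import numpy as np

# Minimal sketch of a loosely-coupled GPS/INS error-state Kalman filter,
# cut down from 15 states to 2 (position error, velocity error).
# All numerical values are hypothetical.

dt = 0.01                      # INS update interval [s]
F = np.array([[1.0, dt],       # error-state transition: position error
              [0.0, 1.0]])     # accumulates velocity error
H = np.array([[1.0, 0.0]])     # GPS observes position error only
Q = np.diag([1e-4, 1e-3])      # process noise (INS drift)
R = np.array([[4.0]])          # GPS position noise variance [m^2]

x = np.zeros((2, 1))           # error-state estimate
P = np.eye(2)                  # error covariance

def update(z_gps_minus_ins):
    """One predict/correct cycle; the measurement is the GPS-minus-INS
    position difference (the loosely-coupled coupling point)."""
    global x, P
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # correct
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([[z_gps_minus_ins]]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x.ravel()

est = update(1.5)   # a 1.5 m GPS/INS position disagreement
```

Feeding the estimated errors back to correct the INS output is what makes the filter feedback-type; the full 15-state version adds attitude and sensor-bias states in the same pattern.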

Simulation of Regional Climate over East Asia using Dynamical Downscaling Method

  • Oh, Jai-Ho;Kim, Tae-Kook;Min, Young-Mi
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2002.05b
    • /
    • pp.1187-1194
    • /
    • 2002
  • In this study, we have simulated the regional climate over East Asia using a dynamical downscaling method. For the dynamical downscaling experiments, MM5 with 27 km horizontal resolution and 18 sigma-coordinate layers in the vertical is nested within global-scale NCEP reanalysis data with 2.5° × 2.5° resolution in longitude and latitude. In the regional simulation, monthly mean features for January and July 1979 have been obtained by both continuous integration and daily-restart integration, driven by updating the lateral boundary forcing at 6-hr intervals from the NCEP reanalysis data using a nudging scheme, with the corresponding design of initial and boundary conditions in both the continuous and restart integrations. As a result, we successfully generated fine-scale regional features forced by topography, lakes, coastlines, and land-use distribution. There is no significant difference in the monthly mean features whether the model is integrated continuously or restarted daily. For climatologically long integrations, the initial condition may not be significantly important. Accordingly, MM5 can be integrated for a long period without frequent restarts, if a proper lateral boundary forcing is given.

  • PDF
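The boundary-updating idea in the abstract above can be illustrated with a toy relaxation (nudging) step, not the actual MM5 scheme: interior points evolve freely while lateral boundary points are relaxed toward a reanalysis value refreshed every 6 hours. Grid size, time step, relaxation time, and temperatures below are all invented.

```python
import numpy as np

# Toy 1-D nudging sketch: relax only the lateral boundary points of a
# model field toward a "reanalysis" boundary value over one 6-h interval.
# Every number here is illustrative, not an MM5 setting.

dt = 3600.0                    # model time step [s]
tau = 6 * 3600.0               # relaxation time scale [s]
field = np.full(10, 280.0)     # model temperature field [K]
boundary = 290.0               # reanalysis value valid for this interval

for _ in range(6):             # integrate one 6-h boundary-update interval
    for i in (0, -1):          # nudge the two lateral boundary points only
        field[i] += dt / tau * (boundary - field[i])
```

Each step moves a boundary point a fraction `dt/tau` of the way toward the forcing value, so the boundaries track the reanalysis while the interior is left to the model dynamics.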

An Empirical Analysis on the Long-term Balance of Bunker Oil Prices Using the Co-integration Model and Vector Error Correction Model (공적분·벡터오차수정모형을 활용한 벙커유 가격의 장기균형 수렴에 관한 실증분석)

  • Ahn, Young-Gyun;Lee, Min-Kyu
    • Korea Trade Review
    • /
    • v.44 no.1
    • /
    • pp.75-86
    • /
    • 2019
  • This study performs a factor analysis of the determinants of the bunker oil price using a co-integration model and a Vector Error Correction Model (VECM). For this purpose, we use data from Clarkson. The analysis results show a 17.6% decrease in the bunker oil price when crude oil production increases by 1.0%, a 10.3% increase when the seaborne trade volume increases by 1.0%, a 1.0% decrease when the total volume of vessels increases by 1.0%, and a 0.003% increase when world GDP increases by 1.0%. This study is meaningful in that it estimates the speed of convergence to the long-term equilibrium and identifies the price adjustment mechanism that naturally exists in the bunker oil market. Future studies are expected to provide statistically more meaningful econometric results if they can obtain data over longer periods and use a wider variety of explanatory variables.
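The convergence-to-equilibrium idea the study tests can be sketched with simulated data. The two-step, Engle-Granger-style code below illustrates estimating a long-run relation and a speed-of-adjustment coefficient; it is not the authors' VECM, and every series and number is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# A common stochastic trend (a stand-in for shared oil-market fundamentals)
trend = np.cumsum(rng.normal(size=n))
# Two cointegrated series: a hypothetical driver x and "bunker price" y
x = trend + rng.normal(scale=0.5, size=n)
y = 2.0 * trend + rng.normal(scale=0.5, size=n)

# Step 1: long-run relation y = a + b*x by OLS
X = np.column_stack([np.ones(n), x])
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
ect = y - (a + b * x)          # error-correction term: deviation from equilibrium

# Step 2: speed of adjustment: dy_t = alpha * ect_{t-1} + e_t
dy = np.diff(y)
alpha = np.linalg.lstsq(ect[:-1, None], dy, rcond=None)[0][0]
```

A negative `alpha` indicates that deviations from the long-run relation are corrected over time; its magnitude is the speed of convergence the abstract refers to.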

Planning Directions of Community Facilities Integrating Generations based on Local Communities

  • Jae Hee CHUNG;Ji Min KIM;Su Jin LEE;Sung Ze YI
    • The Journal of Economics, Marketing and Management
    • /
    • v.12 no.1
    • /
    • pp.39-51
    • /
    • 2024
  • Purpose: This study aims to derive planning directions for community facilities that integrate generations based on local communities, in order to promote sustainable intergenerational exchange, by analyzing the spatial configuration and programs of domestic and foreign generation-integrated community facilities. Research design, data and methodology: Through theoretical consideration, the concept of intergenerational integration, the types of intergenerational exchange, and the types of spatial arrangement were identified. Then, a case-study analysis of domestic and foreign community facilities with well-planned intergenerational exchange spaces and programs was conducted to identify intergenerational integration and to derive community facility planning directions. Results: The results of this research are as follows. First, in terms of humanware, in order to revitalize continuous exchange between the 1st, 2nd, and 3rd generations, a systematic support system is needed to build mutual trust through voluntary participation by each generation. Second, from a hardware perspective, it is important to provide a variety of shared spaces while maintaining the uniqueness of each facility, and spaces must be planned so that selective interaction can take place, with both privacy and interaction in mind. Third, in terms of software, programs that meet the characteristics of each user group must be provided. Conclusions: It is expected that the results of this research can be used as basic data for planning community facilities that integrate generations based on local communities, contributing to the search for sustainable ways to revitalize intergenerational exchange in the future.

A Case Study of Rapid AI Service Deployment - Iris Classification System

  • Yonghee LEE
    • Korean Journal of Artificial Intelligence
    • /
    • v.11 no.4
    • /
    • pp.29-34
    • /
    • 2023
  • The flow from developing a machine learning model to deploying it in a production environment faces many challenges, and efficient and reliable deployment is critical for realizing the true value of machine learning models. Bridging this gap between development and deployment has become a pivotal concern in the machine learning community. FastAPI, a modern and fast web framework for building APIs with Python, has gained substantial popularity for its speed, ease of use, and asynchronous capabilities. This paper focuses on leveraging FastAPI for deploying machine learning models, addressing integration, scalability, and performance in a production setting. In this work, we explored the seamless integration of machine learning models into FastAPI applications, enabling real-time predictions and showing the possibility of scaling up to a more diverse range of use cases. We discussed the intricacies of integrating popular machine learning frameworks with FastAPI, ensuring smooth interactions among data processing, model inference, and API responses. The study elucidates the integration of machine learning models into production environments using FastAPI, exploring its capabilities, features, and best practices, and delves into its potential as a robust and efficient solution for deploying machine learning systems: handling real-time predictions, managing input/output data, and ensuring optimal performance and reliability.
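The serving pattern described above can be sketched without the framework itself. The toy nearest-centroid iris classifier below stands in for a trained model; in a FastAPI app, `predict` would be registered as a route (e.g. with `@app.post("/predict")`) so that the dict it returns becomes the JSON response. The centroid values are rough class means and purely illustrative.

```python
import math

# Toy stand-in for a trained iris model: classify a flower by its nearest
# class centroid. Centroid values are approximate and illustrative only.
CENTROIDS = {
    "setosa":     (5.0, 3.4, 1.5, 0.25),
    "versicolor": (5.9, 2.8, 4.3, 1.3),
    "virginica":  (6.6, 3.0, 5.6, 2.0),
}

def predict(features):
    """Return an API-style response for one measurement
    (sepal length, sepal width, petal length, petal width)."""
    def dist(label):
        return math.dist(features, CENTROIDS[label])
    return {"prediction": min(CENTROIDS, key=dist)}

resp = predict([5.1, 3.5, 1.4, 0.2])
```

Keeping model inference in a plain function like this, separate from the web layer, is what makes it easy to wrap in a FastAPI route and to test without a running server.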

From proteomics toward systems biology: integration of different types of proteomics data into network models

  • Rho, Sang-Chul;You, Sung-Yong;Kim, Yong-Soo;Hwang, Dae-Hee
    • BMB Reports
    • /
    • v.41 no.3
    • /
    • pp.184-193
    • /
    • 2008
  • Living organisms are comprised of various systems at different levels, i.e., organs, tissues, and cells. Each system carries out its diverse functions in response to environmental and genetic perturbations by utilizing biological networks, in which nodal components, such as DNA, mRNAs, proteins, and metabolites, closely interact with each other. Systems biology investigates such systems by producing comprehensive global data that represent different levels of biological information, i.e., at the DNA, mRNA, protein, or metabolite level, and by integrating these data into network models that generate coherent hypotheses for given biological situations. This review presents a systems biology framework, called the 'Integrative Proteomics Data Analysis Pipeline' (IPDAP), which generates mechanistic hypotheses from network models reconstructed by integrating diverse types of proteomic data generated by mass spectrometry-based proteomic analyses. The devised framework includes a serial set of computational and network analysis tools. Here, we demonstrate its functionalities by applying these tools to several conceptual examples.
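The kind of data integration such a pipeline performs can be hinted at with a toy example: overlaying two proteomic measurements on a small interaction network and keeping the edges whose endpoints are perturbed in at least one data type. The protein names are real identifiers, but every number and threshold below is invented and the logic is a simplification, not IPDAP itself.

```python
# Toy network integration: combine two proteomic data types (abundance,
# phosphorylation) over a small protein-interaction network and flag
# edges whose both endpoints are perturbed. All values are invented.

edges = [("EGFR", "GRB2"), ("GRB2", "SOS1"), ("SOS1", "KRAS")]
abundance = {"EGFR": 2.1, "GRB2": 1.8, "SOS1": 0.3, "KRAS": 1.6}  # log2 fold change
phospho   = {"EGFR": 1.5, "GRB2": 0.9, "KRAS": 1.2}               # SOS1 unmeasured

def perturbed_edges(edges, datasets, threshold=1.0):
    """Keep edges where both endpoints exceed the threshold
    in at least one of the integrated data types."""
    def hit(protein):
        return any(abs(d.get(protein, 0.0)) >= threshold for d in datasets)
    return [e for e in edges if hit(e[0]) and hit(e[1])]

hits = perturbed_edges(edges, [abundance, phospho])
```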

Data Server Oriented Computing Infrastructure for Process Integration and Multidisciplinary Design Optimization (다분야통합최적설계를 위한 데이터 서버 중심의 컴퓨팅 기반구조)

  • 홍은지;이세정;이재호;김승민
    • Korean Journal of Computational Design and Engineering
    • /
    • v.8 no.4
    • /
    • pp.231-242
    • /
    • 2003
  • Multidisciplinary Design Optimization (MDO) is an optimization technique that simultaneously considers multiple disciplines, such as dynamics, mechanics, structural analysis, thermal and fluid analysis, and electromagnetic analysis. A software system enabling multidisciplinary design optimization is called an MDO framework. An MDO framework provides an integrated and automated design environment that increases product quality and reliability and decreases design cycle time and cost. The MDO framework also works as a common collaborative workspace for design experts in multiple disciplines. In this paper, we present an architecture for an MDO framework along with the requirement analysis for the framework. The requirement analysis has been performed through interviews with design experts in industry, and thus we claim that it reflects real industrial needs. The requirements include an integrated design environment, a friendly user interface, a highly extensible open architecture, a distributed design environment, an application program interface, and efficient management of massive design data. The resulting MDO framework is data-server-oriented: it is designed around a centralized data server for extensible and effective data exchange among multiple design tools and software in a distributed design environment.

Development of a Design Information Sharing System Using Network and STEP (네트워크와 STEP 표준을 이용한 설계 정보 공유 시스템의 개발)

  • Cho, Sung-Wook;Choi, Young;Kwon, Ki-Eok;Park, Myung-Jin;Yang, Sang-Wook
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.15 no.9
    • /
    • pp.82-92
    • /
    • 1998
  • An international standard for product model data, STEP, and a standard for distributed object technology, CORBA, will play a very important role in the future manufacturing environment. These two technologies provide the background for the sharing of product data and the integration of applications on the network. This paper describes a prototype CAD/CAE environment that is integrated on the network by STEP and CORBA. Several application servers and client software were developed to verify the proposed concept. Present CAD/CAE environments are composed of several individual software components that are not tightly integrated; they also do not utilize the rapidly expanding network and object technologies for collaboration in the product design process. In the design process of a large organization, sharing application resources, design data, and analysis data through the network will greatly enhance productivity. The integration between applications can be supported by two key technologies: CORBA (Common Object Request Broker Architecture) and STEP (Standard for the Exchange of Product Model Data). CORBA provides interoperability between applications on different machines in heterogeneous distributed environments and seamlessly interconnects distributed object systems. Moreover, if all the data in the CAD/CAE environment are based on STEP, data conversion problems between the application systems can be excluded.

  • PDF
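The neutral-format idea behind STEP, as described in the abstract above, can be sketched in a few lines, with JSON standing in for STEP's actual exchange format purely for illustration: every tool imports and exports one shared representation, so no pairwise converters between applications are needed.

```python
import json

# Sketch of neutral-format exchange: a CAD-side exporter and a CAE-side
# importer share one schema. JSON is a stand-in for the STEP exchange
# format here, and the part record is invented for illustration.

def export_part(part):
    """CAD side: serialize a part record to the neutral format."""
    return json.dumps(part, sort_keys=True)

def import_part(text):
    """CAE side: parse the neutral format back into a native structure."""
    return json.loads(text)

part = {"name": "bracket", "material": "AL6061", "thickness_mm": 3.0}
roundtrip = import_part(export_part(part))
```

With N tools, a shared neutral format needs N importer/exporter pairs instead of the N×(N−1) direct converters that pairwise integration would require.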

On the Integration of Systems Design and Systems Safety Processes from an Integrated Data Model Viewpoint (데이터모델 관점에서의 시스템설계 및 시스템안전 프로세스의 통합에 관한 연구)

  • Kim, Young-Min;Lee, Jae-Chon
    • Journal of the Korea Safety Management & Science
    • /
    • v.14 no.4
    • /
    • pp.107-116
    • /
    • 2012
  • The issues raised so far in the development of safety-critical systems have centered on how effectively the safety requirements are met in systems design. Systems are becoming more complex due to increasing demands on functionality and performance. As such, the integration of the systems design and systems safety processes becomes more important and, at the same time, quite difficult to carry out. In this paper, an approach to solving this problem is presented, based on an integrated data model. To do so, the data generated from the inputs and outputs of the systems design and systems safety processes are analyzed first. The results of the analysis are used to extract common attributes among the data, thereby making it possible to define classes. These classes then become the core of the interface data model, through which the interaction between the two processes under study can be modeled and interpreted. The approach has also been applied in a design case to demonstrate its value. It is expected that the results of the study could serve as a stepping stone toward developing the architecture of the integrated process.
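The step of extracting common attributes into classes can be hinted at with a hypothetical interface class shared by the two processes; the class name, fields, and identifiers below are illustrative, not taken from the paper.

```python
from dataclasses import dataclass, field

# Hypothetical interface-data-model sketch: one record type carries the
# attributes common to the design and safety processes, so both processes
# reference (and annotate) the same object instead of keeping copies.

@dataclass
class Requirement:
    req_id: str
    text: str
    source_process: str                       # "design" or "safety"
    linked_hazards: list = field(default_factory=list)

def link(req, hazard_id):
    """Safety process annotates a design requirement with a hazard."""
    req.linked_hazards.append(hazard_id)
    return req

r = Requirement("REQ-001", "Brake shall engage within 0.2 s", "design")
link(r, "HAZ-017")
```

Because both processes operate on the same instance, a change made by one (here, the hazard link) is immediately visible to the other, which is the interaction the interface data model is meant to capture.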

Multilingual Knowledge Graphs: Challenges and Opportunities

  • Partha Sarathi Mandal;Sukumar Mandal
    • International Journal of Knowledge Content Development & Technology
    • /
    • v.14 no.4
    • /
    • pp.101-111
    • /
    • 2024
  • Multilingual Knowledge Graphs (MKGs) have emerged as a crucial component in various natural language processing tasks, enabling efficient representation and utilization of structured knowledge across multiple languages. One can get data, information, and knowledge from various sectors, like libraries, archives, institutional repositories, etc. Variable quality of metadata, multilingualism, and semantic diversity make it a challenge to create a digital library and multilingual search facility. To accept these challenges, there is a need to design a framework to integrate various structured and unstructured data sources for integration, unification, and sharing databases. These are controlled using linked data and semantic web approaches. In future, multilingual knowledge graph overcomes all the linguistic nuances, technical barriers like semantic interoperability, data harmonization etc and enhance cooperation and collaboration throughout the world. Through a comprehensive analysis of the current state-of-the-art techniques and ongoing research efforts, this paper aims to offer insights into the future directions and potential advancements in the field of Multilingual Knowledge Graphs. This paper deals with a multilingual knowledge graph and how to build up a multilingual knowledge graph. It also focuses on the various challenges and opportunities for designing multilingual knowledge graphs.