• Title/Summary/Keyword: Software Process Infrastructure

Variability Analysis Approach for Business Process Family Models (비즈니스 프로세스 패밀리 모델을 위한 가변성 분석 방법)

  • Moon, Mi-Kyeong;Yeom, Keun-Hyuk
    • The KIPS Transactions:PartD / v.15D no.5 / pp.621-628 / 2008
  • Many of today's businesses need flexible IT systems for on-demand business that can adapt rapidly to environmental changes. Service-oriented architecture (SOA) provides the infrastructure that makes such business flexibility possible in an on-demand operating environment, but satisfying these requirements also calls for a new approach that assures business flexibility and enhances reuse. In this paper, we propose an approach for developing a business process family model (BPFM) in which the variabilities of a business process family are represented explicitly, using the variability analysis method of software product lines. In addition, we describe a supporting tool for the approach: it models the BPFM and automatically generates business process models (BPMs) from the BPFM through decision and pruning processes. Using this approach, a business and its IT systems can respond to changes in the business environment rapidly and efficiently.
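
The decision-and-pruning step this abstract describes can be illustrated with a small, hypothetical sketch (names and structure invented, not taken from the paper's tool): a family model whose activities are either common or attached to a variation point, plus a pruning function that resolves each variation point from a decision table and emits one concrete business process model.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical, minimal sketch of a business process family model (BPFM):
# activities are either common (mandatory) or attached to a variation point.

@dataclass
class Activity:
    name: str
    variation_point: Optional[str] = None   # None => common to every family member

@dataclass
class BusinessProcessFamilyModel:
    activities: list = field(default_factory=list)

    def prune(self, decisions: dict) -> list:
        """Resolve each variation point by a yes/no decision and return
        the activity sequence of one concrete business process model."""
        bpm = []
        for act in self.activities:
            if act.variation_point is None or decisions.get(act.variation_point, False):
                bpm.append(act.name)
        return bpm

# Example: an order-handling family with two optional regions.
family = BusinessProcessFamilyModel([
    Activity("Receive order"),
    Activity("Check credit", variation_point="VP_credit_check"),
    Activity("Reserve stock"),
    Activity("Gift wrapping", variation_point="VP_gift_wrap"),
    Activity("Ship order"),
])

print(family.prune({"VP_credit_check": True, "VP_gift_wrap": False}))
# ['Receive order', 'Check credit', 'Reserve stock', 'Ship order']
```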

A Study on Similarity Calculation Method Between Research Infrastructure (국가연구시설장비의 유사도 판단기법에 관한 연구)

  • Kim, Yong Joo;Kim, Young Chan
    • KIPS Transactions on Software and Data Engineering / v.7 no.12 / pp.469-476 / 2018
  • Joint utilization and efficient construction of research infrastructure are essential to the science and technology R&D process. Although various classification schemes have been introduced so that registered information can be used efficiently, directly usable functions such as searching for similar research infrastructure have not yet been implemented because of limitations in the collected information. In this study, we analyze existing similarity-search techniques, present a methodology for calculating the similarity between research infrastructure items, and analyze the learning results. The study suggests a technique that extracts meaningful keywords from the registered information and analyzes the similarity between research infrastructure items.
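
The abstract does not give the similarity formula; as a hedged illustration of the general idea (keyword extraction followed by pairwise similarity), the sketch below uses TF-IDF weighting and cosine similarity over free-text equipment descriptions. The records and scores are invented for the example.

```python
# Illustrative only: TF-IDF keywords + cosine similarity between
# free-text descriptions of research equipment (records are invented).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

descriptions = [
    "scanning electron microscope for nanoscale surface imaging",
    "transmission electron microscope for atomic resolution imaging",
    "gas chromatograph for volatile compound separation",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(descriptions)   # one row per equipment record
similarity = cosine_similarity(tfidf)            # pairwise similarity matrix

# The two electron microscopes score far higher against each other
# than either does against the chromatograph.
print(similarity.round(2))
```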

An Empirical Investigation of Vendor Readiness to Assess Offshore Software Maintenance Outsourcing Project

  • Ikram, Atif;Jalil, Masita Abdul;Ngah, Amir Bin;Khan, Ahmad Salman;Mahmood, Yasir
    • International Journal of Computer Science & Network Security / v.22 no.3 / pp.229-235 / 2022
  • The process of correcting, upgrading, and improving software products after they have been handed over to the customer is known as software maintenance. Through offshore software maintenance outsourcing (OSMO), clients benefit from cost savings, time savings, and improved software quality. In most circumstances the OSMO vendor profits as well, but not always, especially when the OSMO project offer is not properly assessed. An efficient outsourcing contract can yield successful outcomes for outsourced projects, but before submitting a detailed proposal to bid on an OSMO project, the vendor must assess the client's project (business offer) requirements. The purpose of this study is to find common trends in the assessment of OSMO projects. A case study approach, with semi-structured interviews at eight companies, yielded ten common practices and several roles. Among these practices, four (code structure, requirements, communication barriers, and required infrastructure) were consistent across the responses. The findings, limitations, and future work are discussed.
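
The study is qualitative, but the four practices reported as consistent across vendors suggest a simple readiness checklist; the sketch below scores an incoming project offer against them. The weights, scores, and bid threshold are invented for illustration, not drawn from the paper.

```python
# Hypothetical readiness check against the four practices the study found
# consistent across vendors (scores and threshold are illustrative only).
CRITERIA = ("code_structure", "requirements", "communication_barriers", "required_infrastructure")

def readiness_score(assessment: dict) -> float:
    """Average the 0..1 scores for the four consistently reported practices."""
    return sum(assessment.get(c, 0.0) for c in CRITERIA) / len(CRITERIA)

offer = {
    "code_structure": 0.8,            # legacy code is reasonably modular
    "requirements": 0.6,              # maintenance scope only partially specified
    "communication_barriers": 0.4,    # large time-zone gap, no shared language
    "required_infrastructure": 0.9,   # client provides VPN and issue tracker
}

score = readiness_score(offer)
print(f"readiness = {score:.2f}, bid = {'yes' if score >= 0.6 else 'review further'}")
```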

Understanding ISP Methodologies and Identifying Requirements of ISP-Supporting Software Tools

  • Kim, Sung-Kun
    • The Journal of Information Technology and Database / v.5 no.1 / pp.51-67 / 1998
  • A number of information systems planning (ISP) methodologies exist. As market competition intensifies, firms are more likely to engage in organizational transformations such as BPR and continuous process improvement (CPI). Because these new requirements must be incorporated into ISP methodologies, the number of available ISP methodologies will keep increasing. However, no framework has been available for understanding and classifying these divergent methodologies. We therefore present a framework for classifying ISP methodologies, use it to categorize the different classes, and identify their limitations in terms of missing elements and links. We then present technical innovations and other methodological advances that, if properly integrated with ISP methodologies, would help derive an IT infrastructure plan more effectively. Finally, in search of software tools that support the application of ISP methodologies, we identify the requirements of ISP-supporting software tools, evaluate the functions of existing tools, and suggest a future direction.

Development of GNSS Field Survey System for Effective Creation of Survey Result and Enhancement of User Convenience (효과적인 측량 성과물 작성 및 사용자 편의성 강화를 위한 GNSS 현장 측량시스템 개발)

  • Park, Joon Kyu;Kim, Min Gyu
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.35 no.3 / pp.203-210 / 2017
  • Korea has established an advanced infrastructure for real-time precise positioning, such as CORS and a virtual reference station service, and continuously upgrades it. However, to utilize this national infrastructure, the acquired spatial information must be processed through many steps before the final product is derived, and the process is highly dependent on foreign software. In this study, a real-time GNSS field survey system was developed and its usability was evaluated. The system improves user convenience and usability, and it was able to conduct surveys effectively and produce survey results. We also compare its surveying performance with existing software to demonstrate the availability of the real-time GNSS survey system. The system developed in this research performs all functions from real-time surveying to the production of the outputs; it can create economic added value by substituting for foreign software and simplifies the work required to produce results after the survey.
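
The paper does not publish its protocol stack; as a hedged sketch of how a field client typically consumes corrections from national CORS/VRS infrastructure, the snippet below opens an NTRIP connection to a caster and reads the RTCM stream a rover would feed into its RTK engine. Host, mountpoint, and credentials are placeholders.

```python
# Minimal NTRIP v1 client sketch (placeholders throughout): connects to a
# caster, requests one mountpoint, and reads the raw RTCM correction stream.
import base64
import socket

CASTER_HOST = "ntrip.example.org"   # placeholder caster
CASTER_PORT = 2101
MOUNTPOINT = "VRS-RTCM3"            # placeholder mountpoint
USER, PASSWORD = "user", "password"

def open_correction_stream() -> socket.socket:
    creds = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    request = (
        f"GET /{MOUNTPOINT} HTTP/1.0\r\n"
        f"User-Agent: NTRIP field-survey-sketch/0.1\r\n"
        f"Authorization: Basic {creds}\r\n\r\n"
    )
    sock = socket.create_connection((CASTER_HOST, CASTER_PORT), timeout=10)
    sock.sendall(request.encode())
    header = sock.recv(1024)
    if b"200" not in header.split(b"\r\n", 1)[0]:
        raise ConnectionError(f"caster refused request: {header!r}")
    return sock

if __name__ == "__main__":
    stream = open_correction_stream()
    rtcm_chunk = stream.recv(4096)   # would be passed on to the GNSS receiver
    print(f"received {len(rtcm_chunk)} bytes of correction data")
    stream.close()
```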

A design process of central stations for GNSS based land transportation infrastructure network (육상교통 사용자를 위한 위성항법기반 중앙국 시스템 설계 및 구현)

  • Son, Min-Hyuk;Kim, Gue-Heon;Heo, Moon-Bum
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2012.10a / pp.374-377 / 2012
  • A GNSS (Global Navigation Satellite System) based land transportation infrastructure system consists of receiving stations and a central station. The functions of the central station include gathering and decoding the receiving stations' data, generating carrier corrections and integrity information, and transmitting the data in real time. In general, the central station architecture must take into account various considerations relating to the system's hardware and software, data archiving and checking, and the availability and continuity of operation. There is a fundamental need for a generic design usable in any situation, and for an expandable and interoperable central station architecture. This paper introduces the process of designing, building, and verifying such a central station.
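
The abstract lists the central station's functions (gathering and decoding receiving-station data, generating carrier corrections and integrity information, and transmitting them in real time). The skeleton below only shows how such functions might be composed into a processing loop; every step is a stub and the details are assumptions, not the paper's design.

```python
# Skeleton only: a processing loop a GNSS central station might run, composed
# from the functions named in the abstract. Every step is a stub/assumption.
import time

def gather_and_decode():
    """Collect raw observation messages from the receiving stations and decode them."""
    return {"station": "RS01", "observations": []}      # stubbed data

def generate_corrections(observations):
    """Produce carrier-phase corrections from the decoded observations."""
    return {"corrections": []}                           # stubbed result

def check_integrity(observations, corrections):
    """Derive integrity information (e.g., flag inconsistent stations)."""
    return {"integrity_ok": True}

def broadcast(corrections, integrity):
    """Transmit corrections and integrity flags to land-transportation users in real time."""
    print("broadcast:", corrections, integrity)

def central_station_loop(cycles: int = 3, period_s: float = 1.0):
    for _ in range(cycles):
        obs = gather_and_decode()
        corr = generate_corrections(obs["observations"])
        integ = check_integrity(obs["observations"], corr)
        broadcast(corr, integ)
        time.sleep(period_s)

if __name__ == "__main__":
    central_station_loop()
```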

Computer modelling of fire consequences on road critical infrastructure - tunnels

  • Pribyl, Pavel;Pribyl, Ondrej;Michek, Jan
    • Structural Monitoring and Maintenance / v.5 no.3 / pp.363-377 / 2018
  • The proper functioning of critical points in the transport infrastructure is decisive for the entire network. Tunnels and bridges certainly belong to the critical points of the surface transport network, both road and rail. Risk management should be a holistic and dynamic process throughout the entire life cycle; however, the level of risk is usually determined only during the design stage, mainly because the analysis is time-consuming and costly. This paper presents a simplified quantitative risk analysis method that can be applied at any time during the decades of a tunnel's lifetime, estimating the changing risks on a continuous basis and thus uncovering hidden safety threats. The presented method is a decision support system for tunnel managers designed to preserve or even increase tunnel safety. The CAPITA method is a deterministic, scenario-oriented risk analysis approach for assessing mortality risks in road tunnels in the most dangerous situation, a fire. It is implemented in the CAPITA risk analysis software; both the method and the software were developed by the authors' team. Unlike existing analyses, which require specialized microsimulation tools for traffic flow, smoke propagation, and evacuation modeling, CAPITA contains a comprehensive database of results from thousands of simulations performed in advance for various combinations of variables. This approach significantly reduces the overall complexity and thus enhances the usability of the resulting risk analysis. Additionally, it gives decision makers a holistic view of not only the expected risk but also the risk's sensitivity to different variables, allowing the tunnel manager or another decision maker to estimate the change in risk whenever traffic conditions in the tunnel change and to see its dependence on particular input variables.
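
The distinguishing idea, replacing on-demand microsimulation with a lookup into precomputed scenario results plus a sensitivity view, can be sketched as follows. The variables, grid values, and risk numbers are invented and are not taken from CAPITA.

```python
# Illustration of a precomputed-scenario lookup (all numbers invented).
# Key: (traffic [veh/h], heavy-goods share [%], fire size [MW]) -> expected fatalities.
PRECOMPUTED_RISK = {
    (1000, 10, 30): 0.02,
    (1000, 20, 30): 0.03,
    (2000, 10, 30): 0.05,
    (2000, 20, 30): 0.08,
    (2000, 20, 100): 0.21,
}

def nearest_scenario(traffic, hgv_share, fire_mw):
    """Return the precomputed scenario closest to the queried tunnel conditions."""
    return min(
        PRECOMPUTED_RISK,
        key=lambda k: abs(k[0] - traffic) / 1000 + abs(k[1] - hgv_share) / 10 + abs(k[2] - fire_mw) / 30,
    )

def risk_with_sensitivity(traffic, hgv_share, fire_mw):
    base = PRECOMPUTED_RISK[nearest_scenario(traffic, hgv_share, fire_mw)]
    # Crude sensitivity: how the risk moves if traffic doubles while other inputs stay put.
    busier = PRECOMPUTED_RISK[nearest_scenario(traffic * 2, hgv_share, fire_mw)]
    return base, busier - base

risk, traffic_sensitivity = risk_with_sensitivity(1100, 12, 30)
print(f"expected risk ~{risk:.2f}, change if traffic doubles ~{traffic_sensitivity:+.2f}")
```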

Data Model Study for National Research Data Commons Service (국가연구데이터커먼즈 서비스를 위한 데이터모델 연구)

  • Cho, Minhee;Lee, Mikyoung;Song, Sa-kwang;Yim, Hyung-Jun
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2022.10a / pp.436-438 / 2022
  • The National Research Data Commons aims to build a system for joint use by arranging analysis resources, such as computing infrastructure, software, toolkits, APIs, and services used for data analysis, together with research data, so that the use of research data is maximized. Systems for sharing and utilizing the publications and research data produced in the R&D process are well established; however, environments in which data and the tightly coupled software and computing infrastructure can be shared and utilized are scarce, and there is no management system for them. In this study, a data model is designed to systematically manage information on the digital research resources required in the data-oriented R&D process. It will be used to register and manage digital research resource information in the National Research Data Commons Service.
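
The abstract names the resource types to be registered together (research data plus the coupled software, toolkits, APIs, and computing infrastructure). A hedged sketch of what such a registry data model could look like, with invented type and field names:

```python
# Hypothetical registry data model for jointly registered "digital research
# resources"; type and field names are invented, not taken from the Commons service.
from dataclasses import dataclass, field
from enum import Enum

class ResourceType(Enum):
    DATASET = "dataset"
    SOFTWARE = "software"
    TOOLKIT = "toolkit"
    API = "api"
    COMPUTING_INFRASTRUCTURE = "computing_infrastructure"

@dataclass
class DigitalResearchResource:
    identifier: str
    title: str
    resource_type: ResourceType
    steward: str                                   # organization responsible for the resource
    depends_on: list = field(default_factory=list) # identifiers of coupled resources

# A dataset registered together with the toolkit and cluster it depends on.
catalog = [
    DigitalResearchResource("res-001", "Traffic sensor archive 2021", ResourceType.DATASET,
                            "Example Institute", depends_on=["res-002", "res-003"]),
    DigitalResearchResource("res-002", "Preprocessing toolkit", ResourceType.TOOLKIT, "Example Institute"),
    DigitalResearchResource("res-003", "Analysis cluster", ResourceType.COMPUTING_INFRASTRUCTURE, "Example Institute"),
]
print([r.identifier for r in catalog if r.resource_type is ResourceType.DATASET])
```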

A Framework for Supporting RFID-enabled Business Processes Automation

  • Moon, Mi-Kyeing
    • Journal of information and communication convergence engineering / v.9 no.6 / pp.712-720 / 2011
  • Radio frequency identification (RFID) is an established technology with the potential, in a variety of applications, to significantly reduce cost and improve performance. Because RFID-enabled applications fulfill similar tasks across a range of processes adapted to use the data gained from RFID tags, they can be considered software products derived from a common infrastructure and from assets that capture specifications in the domain. This paper discusses a framework that supports the development of RFID-enabled applications based on a business process family model (BPFM) that explicitly represents both commonalities and variabilities. To develop the framework, common activities are identified from RFID-enabled applications, and the variabilities in those common activities are analyzed in detail using variation-point concepts. The framework preprocesses RFID data, so RFID-enabled applications can be developed without having to process raw RFID data themselves. Sharing a common model and reusing assets to deploy recurrent services can be considered an advantage in terms of economic significance and overall product quality.
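
Two of the framework's ideas, common RFID activities whose variability is bound per application through variation points, and a preprocessing layer so applications consume cleaned events rather than raw tag reads, are sketched below with invented activity and variant names.

```python
# Hypothetical sketch: a common RFID activity with a variation point, bound per
# application, plus a tiny preprocessing step (duplicate-read filtering).
from typing import Callable, Iterable

def deduplicate_reads(raw_reads: Iterable) -> list:
    """Preprocess raw tag reads: drop consecutive duplicates from the reader."""
    out, last = [], None
    for tag in raw_reads:
        if tag != last:
            out.append(tag)
        last = tag
    return out

# Variation point "on_tag_observed": each RFID-enabled application binds its own variant.
def warehouse_receiving(tag: str) -> str:
    return f"stock level incremented for {tag}"

def retail_checkout(tag: str) -> str:
    return f"{tag} added to customer basket"

def run_rfid_process(raw_reads: Iterable, on_tag_observed: Callable) -> list:
    """Common activity: consume preprocessed reads and apply the bound variant."""
    return [on_tag_observed(tag) for tag in deduplicate_reads(raw_reads)]

print(run_rfid_process(["EPC-1", "EPC-1", "EPC-2"], on_tag_observed=warehouse_receiving))
```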

Data Server Oriented Computing Infrastructure for Process Integration and Multidisciplinary Design Optimization (다분야통합최적설계를 위한 데이터 서버 중심의 컴퓨팅 기반구조)

  • 홍은지;이세정;이재호;김승민
    • Korean Journal of Computational Design and Engineering / v.8 no.4 / pp.231-242 / 2003
  • Multidisciplinary Design Optimization (MDO) is an optimization technique that simultaneously considers multiple disciplines such as dynamics, mechanics, structural analysis, thermal and fluid analysis, and electromagnetic analysis. A software system enabling multidisciplinary design optimization is called an MDO framework. An MDO framework provides an integrated and automated design environment that increases product quality and reliability and decreases design cycle time and cost; it also serves as a common collaborative workspace for design experts across multiple disciplines. In this paper, we present an architecture for an MDO framework along with the requirement analysis for the framework. The requirement analysis was performed through interviews with design experts in industry, so we claim that it reflects real industrial needs. The requirements include an integrated design environment, a friendly user interface, a highly extensible open architecture, a distributed design environment, an application program interface, and efficient management of massive design data. The resulting MDO framework is data-server oriented: it is designed around a centralized data server for extensible and effective data exchange among multiple design tools and software packages in a distributed design environment.
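
The abstract describes a centralized data server through which discipline tools exchange design data; the minimal sketch below (an invented interface, not the paper's actual API) shows such a shared store with versioned design variables.

```python
# Invented sketch of a central design-data server shared by discipline tools
# (aerodynamics, structures, thermal, ...), illustrating the data-server-oriented idea only.
from collections import defaultdict

class DesignDataServer:
    """Central store: every tool reads/writes named design variables; each write is versioned."""

    def __init__(self):
        self._history = defaultdict(list)   # name -> [(version, value, written_by)]

    def put(self, name: str, value, written_by: str) -> int:
        version = len(self._history[name]) + 1
        self._history[name].append((version, value, written_by))
        return version

    def get(self, name: str):
        """Return the latest value of a design variable."""
        return self._history[name][-1][1]

server = DesignDataServer()
server.put("wing_span_m", 32.0, written_by="aerodynamics_tool")
server.put("wing_mass_kg", 5400.0, written_by="structures_tool")
# The thermal tool picks up whatever the other disciplines last published.
print(server.get("wing_span_m"), server.get("wing_mass_kg"))
```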