• Title/Summary/Keyword: Automated workflow

39 search results

A Cost-Efficient Job Scheduling Algorithm in Cloud Resource Broker with Scalable VM Allocation Scheme (클라우드 자원 브로커에서 확장성 있는 가상 머신 할당 기법을 이용한 비용 적응형 작업 스케쥴링 알고리즘)

  • Ren, Ye;Kim, Seong-Hwan;Kang, Dong-Ki;Kim, Byung-Sang;Youn, Chan-Hyun
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.1 no.3
    • /
    • pp.137-148
    • /
    • 2012
  • Cloud service users request dedicated virtual computing resources from a cloud service provider so that their jobs are processed in an environment isolated from other users. To automate and optimize this process, this paper proposes a framework for workflow scheduling in the cloud environment, whose core component is a middleware broker that mediates the interaction between users and cloud service providers. Many existing scheduling algorithms allocate jobs to on-demand virtualized resources on a one-job-per-virtual-machine basis. This guarantees isolation of the running jobs, but it leaves each resource under-utilized because a single job rarely exhausts a machine's computing capacity. This paper therefore proposes a cost-efficient job scheduling algorithm that maximizes the utilization of managed resources by increasing the degree of multiprogramming, thereby reducing the number of virtual machines needed and, consequently, the cost of processing requests. The performance degradation caused by thrashing and context switching in the proposed scheme is also considered. Experimental results show that the proposed scheme offers a better cost-performance trade-off than an existing scheme.
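
The core idea, as the abstract describes it, is to pack several jobs onto one virtual machine instead of dedicating a machine per job. The sketch below is a minimal illustration of that packing idea, not the authors' algorithm; `Job`, `VM`, `max_multiprogramming`, and the cost figure are hypothetical.

```python
# Hypothetical sketch of multiprogramming-aware VM allocation: pack jobs onto
# running VMs up to a multiprogramming limit before renting a new VM.
from dataclasses import dataclass, field

@dataclass
class Job:
    job_id: str
    cpu_demand: float          # fraction of one VM's CPU the job needs

@dataclass
class VM:
    vm_id: int
    capacity: float = 1.0      # normalized CPU capacity
    jobs: list = field(default_factory=list)

    def load(self) -> float:
        return sum(j.cpu_demand for j in self.jobs)

def schedule(jobs, max_multiprogramming=4, hourly_cost=0.10):
    """Greedy first-fit packing: reuse a VM if it has spare capacity and its
    degree of multiprogramming is below the limit; otherwise start a new VM."""
    vms = []
    for job in jobs:
        target = next(
            (vm for vm in vms
             if len(vm.jobs) < max_multiprogramming
             and vm.load() + job.cpu_demand <= vm.capacity),
            None)
        if target is None:                     # no reusable VM -> provision one
            target = VM(vm_id=len(vms))
            vms.append(target)
        target.jobs.append(job)
    print(f"{len(jobs)} jobs packed onto {len(vms)} VMs "
          f"(~{len(vms) * hourly_cost:.2f} $/hour)")
    return vms

if __name__ == "__main__":
    demo = [Job(f"j{i}", cpu_demand=0.25) for i in range(10)]
    schedule(demo)   # one-job-per-VM would need 10 VMs; packing needs 3
```

A higher multiprogramming limit saves more VMs but, as the paper notes, eventually costs performance through thrashing and context switching, so the limit is a tuning knob rather than something to maximize.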

A Framework of Intelligent Middleware for DNA Sequence Analysis in Cloud Computing Environment (DNA 서열 분석을 위한 클라우드 컴퓨팅 기반 지능형 미들웨어 설계)

  • Oh, Junseok;Lee, Yoonjae;Lee, Bong Gyou
    • Journal of Internet Computing and Services
    • /
    • v.15 no.1
    • /
    • pp.29-43
    • /
    • 2014
  • The development of NGS technologies and scientific workflows has reduced the time required for decoding DNA sequences. Although these automated technologies have changed the genome sequence analysis environment, limited computing resources still pose problems for the analysis. Most scientific workflow systems are pre-built platforms and are highly complex because many functions are implemented in a single platform, and it is difficult to reuse components of such pre-built systems in a new cloud-based system. Cloud computing can be applied to these systems to reduce analysis time and to enable simultaneous analysis of massive DNA sequence data, while Web service techniques improve the interoperability between DNA sequence analysis systems. This paper proposes a workflow-based middleware that supports Web services, a DBMS, and cloud computing, with the aim of reducing analysis time and supporting lightweight virtual instances. The middleware uses the DBMS to manage pipeline status and to support the creation of lightweight virtual instances in the cloud environment, and it applies RESTful Web services with simple URIs and XML content to improve interoperability. A performance test of the system, comparing its results with other DNA analysis services, still needs to be conducted once the system reaches the stabilization stage.
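
As a rough illustration of the interface style described above (RESTful services with simple URIs and XML content, backed by a DBMS that tracks pipeline status), the following minimal sketch uses Flask and SQLite; the endpoint path, table schema, and pipeline IDs are invented for the example and are not the paper's actual design.

```python
# Minimal sketch (not the authors' middleware) of a RESTful pipeline-status
# service backed by a DBMS, returning XML content over a simple URI.
import sqlite3
from flask import Flask, Response

app = Flask(__name__)
DB = "pipeline.db"

def init_db():
    with sqlite3.connect(DB) as con:
        con.execute("""CREATE TABLE IF NOT EXISTS pipeline_status (
                           pipeline_id TEXT PRIMARY KEY,
                           step        TEXT,
                           state       TEXT)""")
        con.execute("INSERT OR REPLACE INTO pipeline_status VALUES "
                    "('dna-001', 'alignment', 'RUNNING')")

@app.route("/pipelines/<pipeline_id>", methods=["GET"])
def pipeline_status(pipeline_id):
    """Return the current status of one analysis pipeline as XML."""
    with sqlite3.connect(DB) as con:
        row = con.execute(
            "SELECT step, state FROM pipeline_status WHERE pipeline_id=?",
            (pipeline_id,)).fetchone()
    if row is None:
        return Response("<error>not found</error>", 404, mimetype="application/xml")
    step, state = row
    xml = (f"<pipeline id='{pipeline_id}'>"
           f"<step>{step}</step><state>{state}</state></pipeline>")
    return Response(xml, mimetype="application/xml")

if __name__ == "__main__":
    init_db()
    app.run(port=8080)   # GET http://localhost:8080/pipelines/dna-001
```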

Management Automation Technique for Maintaining Performance of Machine Learning-Based Power Grid Condition Prediction Model (기계학습 기반 전력망 상태예측 모델 성능 유지관리 자동화 기법)

  • Lee, Haesung;Lee, Byunsung;Moon, Sangun;Kim, Junhyuk;Lee, Heysun
    • KEPCO Journal on Electric Power and Energy
    • /
    • v.6 no.4
    • /
    • pp.413-418
    • /
    • 2020
  • To keep a machine-learning-based power grid condition prediction model usable in the field, its prediction accuracy must be managed continuously so that performance does not degrade due to overfitting to the initial training data. In this paper, we propose an automation technique for maintaining model performance that increases the accuracy and reliability of the prediction model by considering the characteristics of power grid state data, which changes constantly due to various factors, and that keeps quality at a level applicable in the field. The proposed technique models the series of tasks required to maintain the performance of the power grid condition prediction model as a workflow, using workflow management technology, and then automates it to make the work more efficient. In addition, the reliability of the performance results is secured by evaluating the prediction model with respect to both the degree of change in the statistical characteristics of the data and the level of generalization of the predictions, which has not been attempted in existing techniques. This keeps the accuracy of the prediction model at a consistent level and further enables the development of new prediction models with better performance. As a result, the proposed technique not only solves the problem of performance degradation of the prediction model but also improves the field utilization of the condition prediction model in a complex power grid system.
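
A minimal sketch of the maintenance idea follows, assuming two illustrative checks: a drift score over the statistical characteristics of incoming data and a generalization gap between training and recent data, with retraining triggered when either exceeds a threshold. The thresholds, model, and data are placeholders, not the paper's implementation.

```python
# Hypothetical sketch of the maintenance workflow idea: measure how much the
# statistics of incoming grid-state data have drifted and how well the model
# still generalizes, then decide whether to retrain. Thresholds are illustrative.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error

def drift_score(train_X: np.ndarray, new_X: np.ndarray) -> float:
    """Mean absolute shift of per-feature means, in units of training std."""
    mu_t, sd_t = train_X.mean(axis=0), train_X.std(axis=0) + 1e-9
    return float((np.abs(new_X.mean(axis=0) - mu_t) / sd_t).mean())

def generalization_gap(model, train, new) -> float:
    """Error on recent data minus error on training data."""
    (Xt, yt), (Xn, yn) = train, new
    return mean_absolute_error(yn, model.predict(Xn)) - \
           mean_absolute_error(yt, model.predict(Xt))

def maintenance_step(model, train, new, drift_thr=0.5, gap_thr=0.2):
    d = drift_score(train[0], new[0])
    g = generalization_gap(model, train, new)
    if d > drift_thr or g > gap_thr:           # quality no longer acceptable
        X = np.vstack([train[0], new[0]])
        y = np.concatenate([train[1], new[1]])
        model.fit(X, y)                        # retrain on combined data
        return model, "retrained"
    return model, "kept"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Xt = rng.normal(size=(500, 4)); yt = Xt @ [1, 2, 0, -1] + rng.normal(size=500)
    Xn = Xt + 1.5                               # simulated distribution shift
    yn = Xn @ [1, 2, 0, -1] + rng.normal(size=500)
    m = Ridge().fit(Xt, yt)
    _, action = maintenance_step(m, (Xt, yt), (Xn, yn))
    print(action)                               # expected: "retrained"
```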

Automated Composition System of Web Services by Semantic and Workflow based Hybrid Techniques (시맨틱과 워크플로우 혼합기법에 의한 자동화된 웹 서비스 조합시스템)

  • Lee, Yong-Ju
    • The KIPS Transactions: Part D
    • /
    • v.14D no.2
    • /
    • pp.265-272
    • /
    • 2007
  • In this paper, we implement an automated web service composition system using hybrid techniques that merge the benefits of BPEL with the advantages of OWL-S. BPEL techniques have practical capabilities that fulfil the needs of the business environment, such as fault handling and transaction management. Their main shortcoming, however, is the static composition approach, in which service selection and flow management are done a priori and manually. In contrast, OWL-S techniques use ontologies to describe web service functionality in a machine-understandable form, making it possible to discover and integrate web services automatically. This allows compatible web services, possibly discovered at run time, to be integrated dynamically into the composition schema. However, these approaches are still in their infancy and have been developed largely in isolation from the BPEL composition effort. In this work, we describe the design of the SemanticBPEL architecture, a hybrid of BPEL4WS and OWL-S, and propose algorithms for web service search and integration. SemanticBPEL has been implemented using open source tools. The proposed system is compared with existing BPEL systems through functional analysis, and the comparison shows that our system outperforms them.
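
To make the semantic side of the hybrid concrete, here is a toy sketch of ontology-based service matching and chaining in the spirit of OWL-S discovery; the ontology fragment, service adverts, and the greedy composition strategy are invented for illustration and are not the SemanticBPEL algorithms.

```python
# Illustrative sketch (not SemanticBPEL itself): a service matches a requested
# concept if its advertised output equals, or is a subconcept of, that concept
# in a toy ontology; matched services are chained into an ordered flow.
SUBCLASS_OF = {                      # hypothetical ontology fragment
    "CreditCardPayment": "Payment",
    "Payment": "FinancialTransaction",
}

SERVICES = [                         # hypothetical OWL-S style service adverts
    {"name": "PayByCard",  "inputs": {"Order"}, "output": "CreditCardPayment"},
    {"name": "IssueOrder", "inputs": {"Cart"},  "output": "Order"},
]

def is_subconcept(concept: str, target: str) -> bool:
    while concept is not None:
        if concept == target:
            return True
        concept = SUBCLASS_OF.get(concept)
    return False

def compose(goal: str, available: set):
    """Greedy forward chaining of discovered services into an ordered flow."""
    flow, available = [], set(available)
    while not any(is_subconcept(c, goal) for c in available):
        step = next((s for s in SERVICES
                     if s["inputs"] <= available and s["output"] not in available),
                    None)
        if step is None:
            raise ValueError("no composition found")
        flow.append(step["name"])
        available.add(step["output"])
    return flow

if __name__ == "__main__":
    print(compose("Payment", {"Cart"}))   # ['IssueOrder', 'PayByCard']
```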

Automated Supervision of Data Production - Managing the Creation of Statistical Reports on Periodic Data

  • Schanzenberger, Anja;Lawrence, D.R.
    • 한국디지털정책학회:학술대회논문집
    • /
    • 2004.11a
    • /
    • pp.39-53
    • /
    • 2004
  • Data production systems are generally very large, distributed, and complex systems used to create advanced (mainly statistical) reports. Typically, data is gathered periodically and then aggregated and separated over numerous production steps. These production steps are arranged in a specific sequence (a workflow or production chain) and can be located worldwide. A need to improve and automate the supervision of data production systems has now been recognized; supervision in this context entails planning, monitoring, and controlling data production. Two significant approaches for improving this supervision are introduced here. The first is a 'closely-coupled' approach (direct communication between production jobs and the supervisory tool, which is informed immediately about delays in production), based on traditional production planning methods typically used for manufacturing goods and adapted to data production. The second is a 'loosely-coupled' approach (no direct communication between the supervisory tool and production jobs), which has its origins in proven traditional project management; the supervisory tool simply queries the progress of production continuously. In both cases, dates, costs, resources, and system health information are made available to management, production operators, and administrators to support timely and smooth production of periodic data. Both approaches are described theoretically and compared. The main finding is that both are useful, but in different cases. The main advantages of the closely-coupled approach are its large potential for production optimization and a production overview in the form of a job execution plan, whereas the loosely-coupled method mainly supports unhindered job execution and offers a sophisticated production overview in the form of a milestone schedule. Ideas for further research include investigating other potential approaches and comparing them theoretically and practically.
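
The contrast between the two approaches can be sketched in a few lines of code: in the closely-coupled style the production jobs push delay reports to the supervisory tool, while in the loosely-coupled style the tool polls progress against a milestone schedule. Everything below (class names, job names, times) is a hypothetical illustration, not the tools described in the paper.

```python
# Toy contrast between the two supervision styles: closely coupled = jobs push
# delay notifications; loosely coupled = the supervisory tool polls progress
# against a milestone schedule.
import datetime as dt

class CloselyCoupledSupervisor:
    """Production jobs call report() themselves as soon as they slip."""
    def report(self, job_id: str, delay_minutes: int):
        print(f"[push] {job_id} reports a delay of {delay_minutes} min "
              "-> re-plan the job execution plan immediately")

class LooselyCoupledSupervisor:
    """No callbacks from jobs; the supervisor checks milestones on a timer."""
    def __init__(self, milestones: dict):
        self.milestones = milestones          # job_id -> planned finish time

    def poll(self, progress: dict, now: dt.datetime):
        for job_id, planned in self.milestones.items():
            done = progress.get(job_id, False)
            if not done and now > planned:
                print(f"[poll] {job_id} missed its milestone {planned:%H:%M}")

if __name__ == "__main__":
    CloselyCoupledSupervisor().report("aggregate_eu_sales", 42)
    sup = LooselyCoupledSupervisor(
        {"aggregate_eu_sales": dt.datetime(2024, 1, 1, 9, 0)})
    sup.poll({"aggregate_eu_sales": False}, now=dt.datetime(2024, 1, 1, 10, 15))
```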

Implementation of Microbial Identification Query System for Laboratory Medicine (진단검사의학을 위한 세균동정 쿼리시스템의 구현)

  • Koo Bong Oh;Shin Yong Won
    • Journal of the Korea Society of Computer and Information
    • /
    • v.10 no.1 s.33
    • /
    • pp.113-124
    • /
    • 2005
  • Laboratory medicine involves many kinds of tests and microbes, and the work is too complicated to produce the needed results on time, so we aim to improve the work performance of the clinical laboratory. For this study, we implemented a scheduling system for microbial testing in an agent environment, a workflow management system to manage the test schedule, and a query system to check that schedule; preliminary and final reports of microbial identification can be issued automatically by the agent. The scheduling system can identify shortages or waste of resources and thus enables efficient management and distribution of resources. The query system can check the schedule and retrieve the processing status in a short time, enables automated reporting, and reduces the interrupts and work delays that can occur during the confirmation process; it also allows users to access it from local and remote sites. The system can also reduce conflicts among staff in unexpected situations, because it lets doctors confirm situations such as instrument malfunction or a shortage of agar or reagents, and an overall improvement in work-process efficiency can be expected.
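
A very small sketch of the query-and-report idea is given below, assuming a hypothetical test-order record with a status field; an automated preliminary report is issued as soon as an identification result is present. None of the identifiers correspond to the actual system.

```python
# Hypothetical sketch: query the processing status of scheduled microbial tests
# and issue preliminary reports automatically once an identification exists.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TestOrder:
    order_id: str
    specimen: str
    status: str                  # "SCHEDULED" | "INCUBATING" | "IDENTIFIED"
    organism: Optional[str] = None

ORDERS = [
    TestOrder("M-101", "blood culture", "INCUBATING"),
    TestOrder("M-102", "urine culture", "IDENTIFIED", organism="E. coli"),
]

def query_status(order_id: str) -> str:
    """What the query system exposes to local and remote users."""
    return next(o for o in ORDERS if o.order_id == order_id).status

def preliminary_reports():
    """Agent-style pass: report every identified organism without manual steps."""
    return [f"Preliminary report {o.order_id}: {o.organism} isolated from {o.specimen}"
            for o in ORDERS if o.status == "IDENTIFIED"]

if __name__ == "__main__":
    print(query_status("M-101"))          # INCUBATING
    print("\n".join(preliminary_reports()))
```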

Improvement Strategies for the Management and Status Inspection System of National Survey Control Point Markers (국가 측량기준점 표지 관리 및 조사체계의 개선방안)

  • Min, Kwan-Sik
    • Journal of Cadastre & Land InformatiX
    • /
    • v.53 no.2
    • /
    • pp.93-106
    • /
    • 2023
  • The objective of this study is to improve the survey rate and reliability of survey control point marks, which serve as the foundation for location-based services. Within the scope of this research, we have inspected the current status and reporting systems for national control point marks, public control point marks, and cadastral control point marks, identifying various issues. To tackle these challenges, we have prepared proposals for new unit cost calculations for national control point mark inspections, enhancements to the inspection and reporting systems for survey control point marks, and preliminary reporting procedures for surveyors using these control point marks. Additionally, we have established a workflow to refine the process of survey control point mark inspections and have proposed an automated system for calculating the unit cost for unreported targets. The study anticipates that these changes will improve the efficiency of future survey control point mark inspection tasks.

Preliminary Test of Google Vertex Artificial Intelligence in Root Dental X-ray Imaging Diagnosis (구글 버텍스 AI을 이용한 치과 X선 영상진단 유용성 평가)

  • Hyun-Ja Jeong
    • Journal of the Korean Society of Radiology
    • /
    • v.18 no.3
    • /
    • pp.267-273
    • /
    • 2024
  • Using the cloud-based Vertex AI platform, which allows an artificial intelligence model to be developed without coding, this study shows that a non-specialist can easily build a learning model and examines its clinical applicability. Nine dental diseases and 2,999 root-disease X-ray images released on the Kaggle site were used as learning data, and the images were randomly split into training, validation, and test sets. Image classification and multi-label learning were performed through hyper-parameter tuning using the learning pipeline in Vertex AI's basic model-training workflow. As a result of AutoML (Automated Machine Learning), the AUC (Area Under the Curve) was 0.967, precision was 95.6%, and recall was 95.2%. The trained model was confirmed to be adequate for clinical diagnosis.
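
The study built the model through the no-code Vertex AI console; for readers who prefer code, the sketch below shows a roughly equivalent flow with the google-cloud-aiplatform Python SDK. The project, bucket, CSV path, and training budget are placeholders, and exact SDK signatures can vary between library versions.

```python
# Rough SDK equivalent of the no-code Vertex AI AutoML flow described above.
# Project, region, bucket, and file names are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-dental-project", location="us-central1")

# Image dataset whose import file lists the Kaggle X-ray images copied to GCS,
# each labeled with one or more of the nine disease classes.
dataset = aiplatform.ImageDataset.create(
    display_name="dental-xray-9-diseases",
    gcs_source="gs://my-bucket/dental_xray_labels.csv",
    import_schema_uri=aiplatform.schema.dataset.ioformat.image.multi_label_classification,
)

# AutoML image training job: multi-label classification with a random split.
job = aiplatform.AutoMLImageTrainingJob(
    display_name="dental-xray-automl",
    prediction_type="classification",
    multi_label=True,
)
model = job.run(
    dataset=dataset,
    model_display_name="dental-xray-model",
    training_fraction_split=0.8,
    validation_fraction_split=0.1,
    test_fraction_split=0.1,
    budget_milli_node_hours=8000,     # 8 node hours; illustrative budget
)
# The evaluation page in the Vertex AI console then reports AUC, precision,
# and recall, the metrics quoted in the abstract.
```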

A Case Study in Applying Hyperautomation Platform for E2E Business Process Automation (E2E 비즈니스 프로세스 자동화를 위한 하이퍼오토메이션 플랫폼 적용방안 및 사례연구)

  • Cheonsu Jeong
    • Information Systems Review
    • /
    • v.25 no.2
    • /
    • pp.31-56
    • /
    • 2023
  • As the COVID-19 pandemic has dragged on, non-contact work has increased, and so has the demand for automating simple, repetitive inquiries and tasks, which has been met with success. Companies are therefore attempting to extend the scope of automated work, applying technologies such as AI to complex and varied end-to-end (E2E) business processes in order to automate entire operations. However, the extension to Intelligent Process Automation (IPA) is still at an early stage, so practical use cases and related solutions are hard to find, and companies with diverse and complex enterprise processes lack sufficient evidence on which to base an adoption decision. To address this problem, this study proposes a Hyperautomation Platform (HAP) consisting of RPA, chatbot, and AI technology, together with an implementation method for achieving intelligent process automation with HAP and practical use cases, so that adoption of HAP can be reviewed objectively and comprehensively. This study is meaningful and valuable for verifying the feasibility of the hyperautomation concept and for actively utilizing HAP.
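
As a purely illustrative sketch of how the three HAP layers are typically wired together (the paper's platform itself is not shown in code), the snippet below lets a chatbot handler classify an utterance with a stand-in AI step and hand qualifying requests to an RPA work queue; all names and the queue mechanism are hypothetical.

```python
# Illustrative wiring of chatbot -> AI classification -> RPA work queue,
# the usual meeting point of the three layers in a hyperautomation platform.
import queue

rpa_task_queue: "queue.Queue[dict]" = queue.Queue()   # stand-in for an RPA trigger API

def ai_classify(utterance: str) -> str:
    """Placeholder intent model; a real platform would call an NLP/ML service."""
    return "invoice_processing" if "invoice" in utterance.lower() else "general_inquiry"

def chatbot_handle(utterance: str) -> str:
    intent = ai_classify(utterance)
    if intent == "invoice_processing":
        rpa_task_queue.put({"bot": "InvoiceBot", "payload": utterance})
        return "Your invoice request was handed to the automation bot."
    return "Let me connect you to an agent."

def rpa_worker():
    """Drains the queue the way an unattended RPA bot would poll its work items."""
    while not rpa_task_queue.empty():
        task = rpa_task_queue.get()
        print(f"[RPA] {task['bot']} processing: {task['payload']}")

if __name__ == "__main__":
    print(chatbot_handle("Please register this invoice from ACME Corp"))
    rpa_worker()
```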