• Title/Summary/Keyword: XES format (XES)


Workflow Process-Aware Data Cubes and Analysis (워크플로우 프로세스 기반 데이터 큐브 및 분석)

  • Jin, Min-hyuck; Kim, Kwang-hoon Pio
    • Journal of Internet Computing and Services, v.19 no.6, pp.83-89, 2018
  • In workflow process intelligence and systems, workflow process mining and analysis are becoming increasingly important. To improve the quality of workflow process intelligence, it is essential to provision an efficient and effective data center that stores workflow enactment event logs for workflow process mining and analytics. In this paper, we propose a three-dimensional process-aware data cube for organizing workflow enterprise data centers so that workflow process enactment event logs in the XES format can be stored efficiently and effectively. As a validation step, we carry out an experimental process mining study to show how well the process-aware data cube supports the discovery of workflow process patterns and related analytical knowledge, such as enactment proportions and work transferences, from the workflow process enactment event histories. Finally, we confirm that the fundamental control-flow patterns of workflow processes can be discovered through the implemented workflow process mining system based on the process-aware data cube.
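For readers unfamiliar with XES, the sketch below shows one way enactment event logs could be aggregated along three dimensions. The choice of dimensions (case, activity, performer), the helper `xes_attr`, and the function name `build_cube` are illustrative assumptions; this does not reproduce the paper's actual data cube design.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

def xes_attr(elem, key):
    """Look up an XES attribute child (e.g. <string key="..." value="..."/>) by key."""
    return next((c.get("value") for c in elem if c.get("key") == key), None)

def build_cube(xes_path):
    """Count XES events along three assumed dimensions: (case id, activity, performer)."""
    cube = defaultdict(int)
    for trace in ET.parse(xes_path).getroot():
        if not trace.tag.endswith("trace"):
            continue
        case_id = xes_attr(trace, "concept:name") or "unknown-case"
        for event in trace:
            if not event.tag.endswith("event"):
                continue
            activity = xes_attr(event, "concept:name") or "unknown-activity"
            performer = xes_attr(event, "org:resource") or "unknown-resource"
            cube[(case_id, activity, performer)] += 1
    return cube
```

Each cell of the resulting cube holds an event count, which can then be sliced by case, activity, or performer for the kind of proportion and work-transference analysis the abstract describes.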

Defining and Discovering Cardinalities of the Temporal Workcases from XES-based Workflow Logs

  • Yun, Jaeyoung; Ahn, Hyun; Kim, Kwanghoon Pio
    • Journal of Internet Computing and Services, v.20 no.3, pp.77-84, 2019
  • A workflow management system manages workflow models that define real-world work processes. A workflow process is defined by sequencing the jobs performed by its performers. Using a workflow management system, we can also analyze the flow of a process and revise it to be more efficient. Much research has focused on building workflow process models more efficiently and managing them more easily. Recently, many studies have used workflow log files, the execution histories of workflow process models recorded by the workflow management system. Our research group is interested in deriving useful knowledge from workflow event logs. In this paper we use XES log files because a large amount of data is available in this format. This paper defines the cardinalities of temporal workcases and shows how to obtain them from workflow event logs. Cardinalities of temporal workcases are the occurrence patterns of critical elements in a workflow process. We discover instance cardinalities, activity cardinalities, and organizational resource cardinalities from several XES-based workflow event logs and visualize them. The instance cardinality describes the occurrence of workflow process instances, the activity cardinality the occurrence of activities, and the organizational cardinality the occurrence of organizational resources. From them, we expect to obtain useful knowledge such as control-flow patterns of the process, frequently executed events, and frequently involved performers. Furthermore, we expect to be able to reconstruct the original process model using only the workflow event logs.
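A minimal, self-contained sketch of counting the three kinds of cardinalities described above from an XES file follows. The helper `xes_attr`, the fallback labels for missing attributes, and the function name are assumptions for illustration, not the authors' implementation.

```python
import xml.etree.ElementTree as ET
from collections import Counter

def xes_attr(elem, key):
    """Look up an XES attribute child (e.g. <string key="..." value="..."/>) by key."""
    return next((c.get("value") for c in elem if c.get("key") == key), None)

def discover_cardinalities(xes_path):
    instance_card = Counter()   # events per workflow process instance (trace)
    activity_card = Counter()   # occurrences per activity name
    resource_card = Counter()   # occurrences per organizational resource
    for trace in ET.parse(xes_path).getroot():
        if not trace.tag.endswith("trace"):
            continue
        case_id = xes_attr(trace, "concept:name") or "unknown-case"
        for event in trace:
            if not event.tag.endswith("event"):
                continue
            instance_card[case_id] += 1
            activity_card[xes_attr(event, "concept:name") or "unknown"] += 1
            resource_card[xes_attr(event, "org:resource") or "unknown"] += 1
    return instance_card, activity_card, resource_card
```

The three counters directly give frequently executed activities and frequently involved performers, and their per-trace distribution is what the abstract calls the occurrence pattern of each element.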

A MapReduce-Based Workflow BIG-Log Clustering Technique (맵리듀스기반 워크플로우 빅-로그 클러스터링 기법)

  • Jin, Min-Hyuck; Kim, Kwanghoon Pio
    • Journal of Internet Computing and Services, v.20 no.1, pp.87-96, 2019
  • In this paper, we propose a MapReduce-supported clustering technique for collecting and classifying distributed workflow enactment event logs as a preprocessing tool. We call these distributed workflow enactment event logs Workflow BIG-Logs because they satisfy the 5V properties of big data: Volume, Velocity, Variety, Veracity, and Value. The clustering technique developed in this paper is devised specifically for the preprocessing phase of a workflow process mining and analysis algorithm that operates on Workflow BIG-Logs. In other words, it uses the MapReduce framework as the Workflow BIG-Log processing platform, supports the IEEE XES standard data format, and is dedicated to the preprocessing phase of the ρ-Algorithm, a typical workflow process mining algorithm based on structured information control nets. More precisely, Workflow BIG-Logs can be clustered in two ways, by activity-based clustering patterns and by performer-based clustering patterns, and we implement an activity-based clustering algorithm on the MapReduce framework. Finally, we verify the proposed clustering technique through an experimental study on the workflow enactment event log datasets released by the BPI Challenges.
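To make the map/shuffle/reduce idea concrete, the following sketch emulates an activity-based clustering pass in plain Python over an in-memory event list. The event-dictionary layout and function names are assumptions for illustration; the paper's technique runs on an actual MapReduce platform over XES-formatted Workflow BIG-Logs.

```python
from collections import defaultdict

def map_phase(events):
    """Emit (activity, event) pairs, one per log event."""
    for event in events:
        yield event["activity"], event

def shuffle(pairs):
    """Group mapped pairs by key, as the MapReduce framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """For each activity, keep the cases that executed it and the event count."""
    clusters = {}
    for activity, events in groups.items():
        clusters[activity] = {
            "count": len(events),
            "cases": sorted({e["case"] for e in events}),
        }
    return clusters

# Toy usage with a hand-made event list (case id, activity name, performer).
events = [
    {"case": "c1", "activity": "register", "performer": "alice"},
    {"case": "c1", "activity": "approve",  "performer": "bob"},
    {"case": "c2", "activity": "register", "performer": "alice"},
]
print(reduce_phase(shuffle(map_phase(events))))
```

A performer-based variant would simply key the map phase on `event["performer"]` instead of the activity name.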

Disjunctive Process Patterns Refinement and Probability Extraction from Workflow Logs

  • Kim, Kyoungsook; Ham, Seonghun; Ahn, Hyun; Kim, Kwanghoon Pio
    • Journal of Internet Computing and Services, v.20 no.3, pp.85-92, 2019
  • In this paper, we extract quantitative relation data between activities from workflow event log files recorded in the XES standard format and connect them to rediscover the workflow process model, from which workflow process patterns and proportions are extracted. Four types of control-flow elements are needed to extract workflow process patterns and proportions from log files: linear (sequential) routing, disjunctive (selective) routing, conjunctive (parallel) routing, and iterative routing patterns. Among these, this paper focuses on disjunctive routing and conjunctive routing. A framework implemented by the authors' research group extracts and arranges the activity data from the log and converts repeated, duplicate relationships into quantitative values. In addition, because parallel processes are recorded in the log file in order of execution time, algorithms for finding and eliminating the resulting information distortion are designed and implemented. With these refined data, we rediscover the workflow process model from the relationships between activities. The experiments are conducted using the Large Bank Transaction Process Model provided by 4TU, and the experimental process and results are visualized.
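As an illustration of proportion extraction for disjunctive routing, the sketch below counts directly-follows pairs in activity-name traces and normalizes each activity's successor counts into branch proportions. The trace representation and function names are assumptions; this does not reproduce the authors' refinement framework or its distortion-elimination step.

```python
from collections import defaultdict

def follows_counts(traces):
    """Count how often activity b directly follows activity a across all traces."""
    counts = defaultdict(lambda: defaultdict(int))
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            counts[a][b] += 1
    return counts

def branch_proportions(traces):
    """Normalize each activity's successor counts into disjunctive branch proportions."""
    proportions = {}
    for a, successors in follows_counts(traces).items():
        total = sum(successors.values())
        proportions[a] = {b: n / total for b, n in successors.items()}
    return proportions

# Toy usage: after "check", 2 of 3 cases go to "approve" and 1 goes to "reject".
traces = [
    ["start", "check", "approve", "end"],
    ["start", "check", "approve", "end"],
    ["start", "check", "reject",  "end"],
]
print(branch_proportions(traces))
```

Activities whose successor proportions sum over more than one branch mark candidate disjunctive split points; conjunctive splits would additionally require distinguishing concurrency from choice, which is where the log-refinement step described above comes in.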