• Title/Summary/Keyword: workflow model

Search results: 215 (processing time: 0.028 seconds)

An Adaptive Business Process Mining Algorithm based on Modified FP-Tree (변형된 FP-트리 기반의 적응형 비즈니스 프로세스 마이닝 알고리즘)

  • Kim, Gun-Woo;Lee, Seung-Hoon;Kim, Jae-Hyung;Seo, Hye-Myung;Son, Jin-Hyun
    • Journal of KIISE: Computing Practices and Letters / v.16 no.3 / pp.301-315 / 2010
  • Recently, competition between companies has intensified, and with it the need to create new business value. Many business organizations are beginning to realize the importance of business process management. Processes, however, often do not run the way they were initially designed, or an inefficient process model may be designed in the first place, often because of a lack of cooperation and understanding between business analysts and system developers. To solve this problem, business process mining, which can serve as the basis for business process re-engineering, has come to be recognized as an important concept. Current process mining research has focused only on extracting workflow-based process models from completed process logs, and is therefore limited in expressing various forms of business processes. A further disadvantage of this approach is that process discovery and log scanning themselves take a considerable amount of time, because the process logs must be re-scanned with every update. In this paper, we present a modified FP-Tree algorithm for business processes, based on the FP-Tree used for association analysis in data mining. Our modified algorithm supports the discovery of a process model at the level of detail appropriate to the user's needs, without re-scanning the entire process log when it is updated.
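
  The paper's modified FP-Tree algorithm itself is not reproduced in the abstract; the following is a minimal, hypothetical Python sketch of the underlying idea of folding process-log traces into an FP-Tree-like prefix tree incrementally, so that frequent activity paths can be read off without re-scanning earlier traces. The class names, example traces, and threshold are illustrative assumptions, not the authors' implementation.

    class TraceTreeNode:
        """One activity in the prefix tree; count tracks how many traces pass through it."""
        def __init__(self, activity):
            self.activity = activity
            self.count = 0
            self.children = {}

    class IncrementalTraceTree:
        """FP-Tree-like prefix tree over process traces; new traces are folded in
        one at a time, so an updated log does not force a re-scan of old traces."""
        def __init__(self):
            self.root = TraceTreeNode(None)

        def insert_trace(self, trace):
            node = self.root
            for activity in trace:
                child = node.children.get(activity)
                if child is None:
                    child = TraceTreeNode(activity)
                    node.children[activity] = child
                child.count += 1
                node = child

        def frequent_paths(self, min_count):
            """Yield (path, count) for every activity prefix seen at least min_count times."""
            stack = [(self.root, [])]
            while stack:
                node, path = stack.pop()
                for child in node.children.values():
                    child_path = path + [child.activity]
                    if child.count >= min_count:
                        yield child_path, child.count
                    stack.append((child, child_path))

    # Hypothetical order-handling traces; a real process log would arrive incrementally.
    tree = IncrementalTraceTree()
    for trace in [["receive", "check", "ship"],
                  ["receive", "check", "reject"],
                  ["receive", "check", "ship"]]:
        tree.insert_trace(trace)

    for path, count in tree.frequent_paths(min_count=2):
        print(" -> ".join(path), count)

  Raising or lowering min_count corresponds roughly to choosing the level of detail of the recovered process model that the abstract mentions.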

Estimation of Mechanical Representative Elementary Volume and Deformability for Cretaceous Granitic Rock Mass: A Case Study of the Gyeongsang Basin, Korea (경상분지 백악기 화강암 암반에 대한 역학적 REV 및 변형특성 추정사례)

  • Um, Jeong-Gi;Ryu, Seongjin
    • The Journal of Engineering Geology / v.32 no.1 / pp.59-72 / 2022
  • This study employed a 3-D numerical analysis based on the distinct element method to estimate the strength and deformability of a Cretaceous biotite granitic rock mass at Gijang, Busan, Korea. A workflow was proposed to evaluate the scale effect and the representative elementary volume (REV) of mechanical properties for fractured rock masses. Directional strength and deformability parameters such as block strength, deformation modulus, shear modulus, and bulk modulus were estimated for a discrete fracture network (DFN) in a cubic block the size of the REV. The size of the mechanical REV for fractured rock masses in the study area was determined to be a 15 m cube. The mean block strength and mean deformation modulus of the DFN cube block were found to be 52.8% and 57.7% of the intact rock's strength and Young's modulus, respectively. A constitutive model was derived for the study area that describes the linear-elastic and orthotropic mechanical behavior of the rock mass. The model is expected to help evaluate the stability of tunnels and underground spaces through equivalent continuum analysis.
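
  The abstract mentions a linear-elastic, orthotropic constitutive model for the rock mass but does not state it; for reference, the standard compliance form that such a model takes is sketched below in generic symbols (E_i, G_ij, nu_ij), not with the study's fitted values.

    \begin{bmatrix} \varepsilon_{11}\\ \varepsilon_{22}\\ \varepsilon_{33}\\ \gamma_{23}\\ \gamma_{31}\\ \gamma_{12} \end{bmatrix}
    =
    \begin{bmatrix}
      1/E_1 & -\nu_{21}/E_2 & -\nu_{31}/E_3 & 0 & 0 & 0\\
      -\nu_{12}/E_1 & 1/E_2 & -\nu_{32}/E_3 & 0 & 0 & 0\\
      -\nu_{13}/E_1 & -\nu_{23}/E_2 & 1/E_3 & 0 & 0 & 0\\
      0 & 0 & 0 & 1/G_{23} & 0 & 0\\
      0 & 0 & 0 & 0 & 1/G_{31} & 0\\
      0 & 0 & 0 & 0 & 0 & 1/G_{12}
    \end{bmatrix}
    \begin{bmatrix} \sigma_{11}\\ \sigma_{22}\\ \sigma_{33}\\ \tau_{23}\\ \tau_{31}\\ \tau_{12} \end{bmatrix}

  with the symmetry constraint \nu_{ij}/E_i = \nu_{ji}/E_j, leaving nine independent constants (three Young's moduli, three shear moduli, three Poisson's ratios).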

A Study on System Requirements for Integrated Electronic Document Management System (IEDMS) (통합전자문서체계구현을 위한 요구기능 분석 연구 -A사의 전자문서관리 사례를 중심으로-)

  • 권택문
    • Journal of Information Technology Application / v.2 no.1 / pp.55-81 / 2000
  • An Electronic Document Management System (EDMS) is an electronic system solution used to create, capture, distribute, edit, store, and manage documents and related structured data repositories throughout an organization. Recently, documents of any type, such as text, images, and video files, as well as structured databases, can be controlled and managed by an office automation system and an EDMS. Thus, many organizations are already using these information technologies to reduce process cycle times. What these organizations are missing, however, is an integrated system that ties into the current workflow or office automation system and provides immediate access to, and automatic routing of, the organization's mission-critical information. This study sought to identify user requirements for integrating the current information system with a relatively new technology, the electronic document management system, in order to improve business operations, productivity, and quality, and to reduce waste. Integration of an EDMS with the office automation system, together with proper use of these technologies, will improve an organization's processes and compress process cycle times. For this study, a case study was conducted by a project team in cooperation with a government organization (referred to as company A). Through this case study, valuable electronic document management and office automation system requirements were identified and reported, providing a system model for the design of an Integrated EDMS (IEDMS).


A demonstration of the H3 trimethylation ChIP-seq analysis of galline follicular mesenchymal cells and male germ cells

  • Chokeshaiusaha, Kaj;Puthier, Denis;Nguyen, Catherine;Sananmuang, Thanida
    • Asian-Australasian Journal of Animal Sciences / v.31 no.6 / pp.791-797 / 2018
  • Objective: Trimethylation of histone 3 (H3) at the 4th lysine from the N-terminus (H3K4me3) in a gene promoter region is a universal marker of genes actively transcribed in a given cell lineage. In contrast, the coexistence of trimethylation at the 27th lysine (H3K27me3) at the same loci, the bivalent H3K4me3/H3K27me3 mark, is known to suspend gene transcription in germ cells and can also be inherited by the derived stem cells. For galline species, a thorough example of H3K4me3 and H3K27me3 ChIP-seq analysis had not yet been provided. We therefore designed and demonstrated such procedures using ChIP-seq and mRNA-seq data of chicken follicular mesenchymal cells and male germ cells. Methods: An analytical workflow was designed and is provided in this study. ChIP-seq and RNA-seq datasets of follicular mesenchymal cells and male germ cells were acquired and properly preprocessed. Peak calling with Model-based Analysis of ChIP-Seq 2 (MACS2) was performed to identify H3K4me3- or H3K27me3-enriched regions (fold-change ≥ 2, FDR ≤ 0.01) in gene promoter regions. The Integrative Genomics Viewer was used to explore the cellular retinoic acid binding protein 1 (CRABP1), growth differentiation factor 10 (GDF10), and gremlin 1 (GREM1) genes. Results: The results indicated that follicular mesenchymal cells and germ cells shared several gene promoter regions uniquely enriched with H3K4me3 (5,704 peaks), as well as regions carrying the bivalent H3K4me3/H3K27me3 mark shared between all cell types and germ cells (1,909 peaks). Subsequent inspection of the follicular mesenchyme-specific genes CRABP1, GDF10, and GREM1 confirmed vigorous transcription of these genes in follicular mesenchymal cells. As expected, the bivalent H3K4me3/H3K27me3 pattern was manifested in the gene promoter regions of germ cells, where transcription of these genes was suspended. Conclusion: According to these results, an example of chicken H3K4me3/H3K27me3 ChIP-seq data analysis was successfully demonstrated in this study. The provided methodology should be useful for galline ChIP-seq data analysis in the future.
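
  The abstract describes the workflow only at a high level; the sketch below is a minimal, hypothetical Python illustration of the final classification step, labelling gene promoters as H3K4me3-only ("active") or bivalent when already-called peaks pass the stated thresholds (fold-change ≥ 2, FDR ≤ 0.01). The promoter windows, peak records, and field names are made-up placeholders, not the authors' data or pipeline.

    def passes(peak, min_fold=2.0, max_fdr=0.01):
        """Keep only peaks meeting the enrichment and FDR cut-offs from the abstract."""
        return peak["fold"] >= min_fold and peak["fdr"] <= max_fdr

    def overlaps(a_start, a_end, b_start, b_end):
        """True if two half-open genomic intervals on the same chromosome overlap."""
        return a_start < b_end and b_start < a_end

    def classify_promoters(promoters, h3k4me3_peaks, h3k27me3_peaks):
        """Return {gene: 'active' | 'bivalent' | 'silent-or-other'} per promoter window."""
        k4 = [p for p in h3k4me3_peaks if passes(p)]
        k27 = [p for p in h3k27me3_peaks if passes(p)]
        result = {}
        for gene, (chrom, start, end) in promoters.items():
            has_k4 = any(p["chrom"] == chrom and overlaps(start, end, p["start"], p["end"]) for p in k4)
            has_k27 = any(p["chrom"] == chrom and overlaps(start, end, p["start"], p["end"]) for p in k27)
            if has_k4 and has_k27:
                result[gene] = "bivalent"
            elif has_k4:
                result[gene] = "active"
            else:
                result[gene] = "silent-or-other"
        return result

    # Hypothetical promoter windows and peak calls, for illustration only.
    promoters = {"CRABP1": ("chr10", 1000, 3000), "GDF10": ("chr2", 5000, 7000)}
    h3k4me3_peaks = [{"chrom": "chr10", "start": 1500, "end": 2500, "fold": 4.2, "fdr": 0.001},
                     {"chrom": "chr2", "start": 5500, "end": 6500, "fold": 3.1, "fdr": 0.004}]
    h3k27me3_peaks = [{"chrom": "chr2", "start": 5200, "end": 6800, "fold": 2.6, "fdr": 0.008}]

    print(classify_promoters(promoters, h3k4me3_peaks, h3k27me3_peaks))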

An Analysis of Big Video Data with Cloud Computing in Ubiquitous City (클라우드 컴퓨팅을 이용한 유시티 비디오 빅데이터 분석)

  • Lee, Hak Geon;Yun, Chang Ho;Park, Jong Won;Lee, Yong Woo
    • Journal of Internet Computing and Services / v.15 no.3 / pp.45-52 / 2014
  • The Ubiquitous City (U-City) is a smart, intelligent city that satisfies people's desire to enjoy IT services with any device, anytime, anywhere. It is a future city model based on the Internet of Everything or Things (IoE or IoT), and it includes a large number of networked video cameras. These networked cameras, together with sensors, supply the main input data for many U-City services, and they continuously generate a huge amount of video information: real big data for the U-City. The U-City is usually required to manipulate this big data in real time, which is not easy at all. It is also often necessary to analyze the accumulated video data to detect an event or find a person, which requires a lot of computational power and usually takes a lot of time. Current research tries to reduce the processing time of big video data, and cloud computing can be a good solution to this problem. Among the many cloud computing methodologies that could be applied, MapReduce is an interesting and attractive one: it has many advantages and is gaining popularity in many areas. Video cameras evolve day by day and their resolution improves sharply, which leads to exponential growth in the data produced by networked cameras; we are dealing with real big data when handling the video produced by high-quality cameras. Video surveillance systems were of limited use before cloud computing, but they are now being widely deployed in U-Cities thanks to such methodologies. Because video data are unstructured, good research results on analyzing them with MapReduce are hard to find. This paper presents an analysis system for video surveillance, a cloud-computing-based video data management system that is easy to deploy, flexible, and reliable. It consists of the video manager, the video monitors, the storage for the video images, the storage client, and the streaming-in component. The video monitor consists of a video translator and a protocol manager, and the storage contains the MapReduce analyzer. All components were designed according to the functional requirements of a video surveillance system. The streaming-in component receives the video data from the networked cameras, delivers them to the storage client, and manages network bottlenecks to smooth the data stream. The storage client receives the video data from the streaming-in component, stores them in the storage, and helps other components access the storage. The video monitor transfers the video data by smooth streaming and manages the protocols: the video translator sub-component lets users manage the resolution, codec, and frame rate of the video, and the protocol sub-component manages the Real Time Streaming Protocol (RTSP) and the Real Time Messaging Protocol (RTMP). We use the Hadoop Distributed File System (HDFS) as the cloud storage; Hadoop stores the data in HDFS and provides a platform that can process the data with the simple MapReduce programming model. We propose our own methodology for analyzing the video images with MapReduce: the workflow of the video analysis is presented and explained in detail in this paper. The performance evaluation was carried out experimentally, and we found that the proposed system worked well; the evaluation results are presented and analyzed in the paper. On our cluster we used compressed 1920×1080 (FHD) video data, the H.264 codec, and HDFS as the video storage, and we measured the processing time according to the number of frames per mapper. Tracing the optimal input split size and the processing time as a function of the number of nodes, we found that the system performance scales linearly.
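
  The paper's own MapReduce code is not given in the abstract; the following is a minimal Hadoop-Streaming-style Python sketch of how a per-frame analysis and a per-camera aggregation might be split between a mapper and a reducer. The tab-separated input format (camera_id, frame_id, brightness), the brightness threshold, and the detector itself are assumptions for illustration only, not the authors' implementation.

    #!/usr/bin/env python3
    # Minimal Hadoop-Streaming-style sketch: the mapper flags frames that trip a
    # hypothetical detector, and the reducer counts flagged frames per camera.
    import sys

    def mapper(lines, threshold=200.0):
        """Emit 'camera_id<TAB>1' for every frame whose brightness exceeds the threshold."""
        for line in lines:
            try:
                camera_id, frame_id, brightness = line.rstrip("\n").split("\t")
            except ValueError:
                continue  # skip malformed records
            if float(brightness) > threshold:
                print(f"{camera_id}\t1")

    def reducer(lines):
        """Sum the per-frame flags emitted by the mapper, grouped by camera_id
        (Hadoop Streaming delivers reducer input sorted by key)."""
        current_key, count = None, 0
        for line in lines:
            key, value = line.rstrip("\n").split("\t")
            if key != current_key:
                if current_key is not None:
                    print(f"{current_key}\t{count}")
                current_key, count = key, 0
            count += int(value)
        if current_key is not None:
            print(f"{current_key}\t{count}")

    if __name__ == "__main__":
        # Hadoop Streaming would invoke this script as either the mapper or the
        # reducer; here the role is selected with a command-line argument.
        if len(sys.argv) > 1 and sys.argv[1] == "reduce":
            reducer(sys.stdin)
        else:
            mapper(sys.stdin)

  In Hadoop Streaming, the number of records handed to each mapper is governed by the input split size, which corresponds to the frames-per-mapper quantity the abstract reports measuring.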